I have a notebook (NB) attached to a lakehouse, and I am executing the following, which works fine:
%%pyspark
from delta.tables import DeltaTable

tbl_name = "fact_sales"
# Relative path, resolved against the attached lakehouse's Tables folder
tbl_path = "Tables/" + tbl_name
delta_table = DeltaTable.forPath(spark, tbl_path)
When I do a Save as copy and then try to run the copied NB, it fails with:
AnalysisException: Tables/fact_sales is not a Delta table
which is absolutely false. Why is it failing on the copied notebook (the copy still shows the NB is attached to the same lakehouse as the original) when it executes perfectly on the original?
This worked:

tbl_name = "fact_sales"
tbl_name_path = "Tables/" + tbl_name
# forPath with the relative path fails in the copied notebook:
#delta_table = DeltaTable.forPath(spark, tbl_name_path)
# forName looks the table up through the Spark catalog instead:
delta_table = DeltaTable.forName(spark, tbl_name)
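For reference, a minimal side-by-side sketch of the two lookup styles (the workspace and lakehouse names below are placeholders, not from the original thread): DeltaTable.forName resolves the table through the Spark catalog, so it does not depend on which lakehouse is mounted as the notebook's default, while DeltaTable.forPath can still be used if you pass an absolute OneLake ABFSS path instead of a relative Tables/ path.

%%pyspark
from delta.tables import DeltaTable

tbl_name = "fact_sales"

# Catalog lookup: independent of the default-lakehouse mount.
delta_table = DeltaTable.forName(spark, tbl_name)

# Path lookup still works if you give it an absolute OneLake URI.
# <workspace> and <lakehouse> are placeholders; substitute your own names.
abfss_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse>.Lakehouse/Tables/" + tbl_name
)
delta_table_by_path = DeltaTable.forPath(spark, abfss_path)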
Hi @smpa01,

How did you configure the default lakehouse and environment settings? Can you please share some more detail about your operations?

I copied your code and tried the 'Save as copy' feature to create a notebook, and both notebooks work well. (I pinned the default lakehouse and confirmed the Delta table exists at the Tables level.)

Regards,
Xiaoxin Sheng