smpa01
Esteemed Contributor

Notebook fails after save as copy

I have a notebook (NB) that is attached to a lakehouse, and I am executing the following, which works fine:

 

%%pyspark
from delta.tables import *

tbl_name = "fact_sales"
tbl_path = "Tables/" + tbl_name

# relative path to the table folder under the default lakehouse
delta_table = DeltaTable.forPath(spark, tbl_path)

 

When I do Save as copy and then try to run the copied NB, it fails.

 


 

It is showing

AnalysisException: Tables/fact_sales is not a Delta table

which is absolutely false.

 

Why is it failing on the copied notebook (the copy still shows the NB is attached to the same LH as the original) when it executes perfectly on the original?
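
For reference, a minimal check that would narrow down whether the copied notebook resolves the relative path at all; this is a sketch, assuming the default lakehouse is pinned and that mssparkutils is available (it is pre-loaded in Fabric notebooks):

%%pyspark
from delta.tables import DeltaTable

tbl_name = "fact_sales"
tbl_path = "Tables/" + tbl_name

# Can Delta Lake see a table at the relative path from this notebook's context?
print(DeltaTable.isDeltaTable(spark, tbl_path))

# Is the table registered in the default lakehouse's catalog?
print(spark.catalog.tableExists(tbl_name))

# What does the notebook actually see under Tables/?
for item in mssparkutils.fs.ls("Tables/"):
    print(item.name)

If isDeltaTable returns False while tableExists returns True, the copy is not resolving relative paths against the same default lakehouse, even though the UI shows it attached.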

1 ACCEPTED SOLUTION
smpa01
Esteemed Contributor

This worked:

tbl_name = "fact_sales"
tbl_name_path = "Tables/" + tbl_name

# delta_table = DeltaTable.forPath(spark, tbl_name_path)
delta_table = DeltaTable.forName(spark, tbl_name)
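
A plausible explanation (not confirmed in this thread): DeltaTable.forName resolves the table by name through the default lakehouse's catalog, while DeltaTable.forPath with a relative "Tables/..." string depends on how the copied notebook resolves relative paths. If relative path resolution is the issue, an absolute OneLake path should also work with forPath; a sketch, where <workspace> and <lakehouse> are placeholders for your own names:

%%pyspark
from delta.tables import DeltaTable

# <workspace> and <lakehouse> are placeholders, not values from this thread
abfss_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse>.Lakehouse/Tables/fact_sales"
)
delta_table = DeltaTable.forPath(spark, abfss_path)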

 

 


2 REPLIES
Anonymous
Not applicable

Hi @smpa01,

How did you configure the default lakehouse and environment settings? Can you please share some more detail about your operations?
I copied your code and used the 'Save as copy' feature to create a second notebook, and both notebooks work well. (I pinned the default lakehouse and confirmed the delta table exists at the Tables level.)
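
One way to confirm which table the notebook resolves is to inspect the Delta detail; a sketch in PySpark (DESCRIBE DETAIL is standard Delta Lake SQL):

%%pyspark
# Shows the table format and the OneLake location resolved through the default lakehouse
detail = spark.sql("DESCRIBE DETAIL fact_sales")
detail.select("format", "location").show(truncate=False)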


Regards,

Xiaoxin Sheng

