I want to read data from two lakehouses and write it to another lakehouse, but during a notebook run I can set only one default lakehouse. Is there any possible way to do this?
Hi @gunjankabra ,
Thanks for using Fabric Community. Yes, it is possible to read data from the default lakehouse and write data to another lakehouse.
Reading data from the default lakehouse -
df = spark.sql("SELECT * FROM gopi_lake_house.customer_table1 LIMIT 1000")
display(df)
Writing data to a different lakehouse -
df.write.format("delta").saveAsTable("gopi_test_lakehouse.sales_external")
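As an alternative, when a lakehouse is not attached to the notebook, the Delta table can be read directly via its OneLake ABFS path instead of a catalog name. A minimal sketch, assuming the workspace is named `my_workspace` (an illustrative name, not from this thread); `spark` is the SparkSession provided by the Fabric notebook runtime:

```python
# Sketch: read a Delta table by its OneLake ABFS path instead of
# a <lakehouse>.<table> catalog name. The workspace name below is
# an assumption for illustration.
path = (
    "abfss://my_workspace@onelake.dfs.fabric.microsoft.com/"
    "gopi_test_lakehouse.Lakehouse/Tables/sales_external"
)
df_sales = spark.read.format("delta").load(path)
display(df_sales)
```

This path-based form sidesteps the default-lakehouse setting entirely, since it addresses the table by its storage location.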
Hope this is helpful. Please let me know in case of further queries.
Hi @gunjankabra ,
We haven't heard from you on the last response and were just checking back to see if you have a resolution yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
Otherwise, please reply back with more details and we will try to help.
I want to read data from two different lakehouses in a single notebook and write to another.
Hi @gunjankabra ,
Please correct if my understanding is wrong.
As per my understanding, you can read tables from different lakehouses in a single notebook and save the result as a table in yet another lakehouse, by fully qualifying each table name as <lakehouse_name>.<table_name>.
Note: Default Lakehouse is gopi_bronze here.
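The pattern above can be sketched like this (a minimal sketch: the second lakehouse name `gopi_silver` and the join column `customer_id` are assumptions for illustration; `spark` is the SparkSession provided by the Fabric notebook):

```python
# Read from the default lakehouse (gopi_bronze) and from a second
# attached lakehouse by fully qualifying tables as <lakehouse>.<table>.
df_customers = spark.sql("SELECT * FROM gopi_bronze.customer_table1")
df_orders = spark.sql("SELECT * FROM gopi_silver.orders_table1")

# Combine the two sources, then write the result to a third lakehouse.
df_joined = df_customers.join(df_orders, on="customer_id", how="inner")
(df_joined.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("gopi_test_lakehouse.sales_external"))
```

Both source lakehouses must be attached to the notebook; only one of them needs to be the default.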
Hope this is helpful. Please let me know in case of further queries.
Hello @gunjankabra ,
We haven't heard from you on the last response and were just checking back to see if you have a resolution yet.
Otherwise, please reply back with more details and we will try to help.