
ivamsikrishna
New Contributor II

Fabric Notebook read data from Lakehouse folder and write (Overwrite/Append) to Warehouse Table

Hi, we have a lakehouse and a warehouse. The lakehouse folders contain a list of parquet files. The idea is to access these files and copy them into a warehouse table (append). I know how to read all the .parquet files in a folder using the spark.read.parquet option, but how do I write them into a table in a different warehouse? We know how to create a table in the same lakehouse, but we want to create the table and copy the data into a different warehouse. Please help; the folder structure is below for example.

Lakehouse1 -> Files -> MasterFolder -> ChildFolder -> (list of .parquet files)

Warehouse -> Schema -> dbo -> Tables -> Tablename

Thank you. 

1 ACCEPTED SOLUTION


2 REPLIES
nilendraFabric
Honored Contributor

Hi @ivamsikrishna

Once you have the DataFrame from the lakehouse, you can write it to a warehouse table using the Fabric Spark connector for Data Warehouse. To append the data to an existing table (or create the table if it does not exist), use the `synapsesql` method with the `append` write mode:

import com.microsoft.spark.fabric

df.write.mode("append").synapsesql("<warehouse_name>.dbo.Tablename")
If your warehouse is in a different workspace from the one running your notebook, specify the target workspace ID with the connector's `Constants.WorkspaceId` option:

from com.microsoft.spark.fabric.Constants import Constants

df.write.option(Constants.WorkspaceId, "<target_workspace_id>").mode("append").synapsesql("<warehouse_name>.dbo.Tablename")
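Putting the read and write steps together, here is a minimal sketch of the whole flow. It assumes the Fabric Spark connector is available in the notebook session (where `spark` is predefined); the helper function names are my own for illustration, not part of the connector API, and the folder/table names come from the question's example structure.

```python
def qualified_table_name(warehouse, table, schema="dbo"):
    """Build the three-part name synapsesql expects: warehouse.schema.table."""
    return f"{warehouse}.{schema}.{table}"


def copy_parquet_to_warehouse(spark, folder_path, target, workspace_id=None):
    """Read all .parquet files under a lakehouse folder and append them to a warehouse table."""
    # spark.read.parquet on a folder picks up every .parquet file inside it.
    df = spark.read.parquet(folder_path)
    writer = df.write.mode("append")
    if workspace_id is not None:
        # Cross-workspace write: route to the warehouse's workspace.
        from com.microsoft.spark.fabric.Constants import Constants
        writer = writer.option(Constants.WorkspaceId, workspace_id)
    writer.synapsesql(target)


# Example call inside a Fabric notebook (placeholders, not real names):
# copy_parquet_to_warehouse(
#     spark,
#     "Files/MasterFolder/ChildFolder",
#     qualified_table_name("<warehouse_name>", "Tablename"),
# )
```

If the warehouse lives in another workspace, pass its ID as `workspace_id`; otherwise the connector targets the notebook's own workspace by default.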

It worked well, appreciate the help.
