Arnab0111
New Contributor III

Save a pyspark dataframe in a table in warehouse using notebook

I am using a notebook and have a PySpark DataFrame. Please guide me on saving it as a table, in overwrite mode, to a warehouse under a custom schema.

1 ACCEPTED SOLUTION
puneetvijwani
Contributor II

@Arnab0111 I would suggest navigating to the Data Engineering experience from the bottom left and opening the Use a sample section; you will see the data engineering starter kit there. It has many examples covering writes and transformations that show how to save a table into the Tables section of a Lakehouse, along with data prep from a data engineering point of view. This saves the table in the default catalog of the Lakehouse (see the sketch below for the write itself).
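For the write step, a minimal sketch, assuming the notebook has a default Lakehouse attached; the DataFrame df and the table name "sales" are placeholders, not names from the original question:

```python
# Minimal sketch: persist a PySpark DataFrame as a managed Delta table in the
# Tables section of the attached lakehouse (it surfaces under dbo on the SQL
# endpoint). "sales" is a placeholder name; overwrite mode replaces the
# table contents on each run.
df.write.format("delta").mode("overwrite").saveAsTable("sales")
```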

After that, open the Data Warehouse and add your Lakehouse SQL endpoint:

[Screenshot: epunvij_0-1693315706438.png]

Query the dbo schema table from the Lakehouse and save it as a view into the custom schema of your Data Warehouse (a scripted sketch of this step follows the screenshot):

[Screenshot: epunvij_1-1693315828900.png]

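If you prefer to script that view step instead of using the warehouse query editor, here is a minimal sketch, assuming pyodbc and "ODBC Driver 18 for SQL Server" are available to the notebook environment; <server>, <warehouse>, MyLakehouse, and the schema/table/view names are all placeholders:

```python
# Hypothetical sketch: create a view in a custom warehouse schema over a dbo
# table exposed through the lakehouse SQL endpoint. Cross-database (three-part)
# naming works once the endpoint has been added to the warehouse.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<server>.datawarehouse.fabric.microsoft.com;"
    "Database=<warehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)

with pyodbc.connect(conn_str, autocommit=True) as conn:
    cur = conn.cursor()
    # Create the custom schema once; skip this line if it already exists.
    cur.execute("CREATE SCHEMA reporting;")
    # The view lives in the warehouse's custom schema and reads the
    # lakehouse dbo table through three-part naming.
    cur.execute(
        "CREATE VIEW reporting.sales_v AS "
        "SELECT * FROM MyLakehouse.dbo.sales;"
    )
```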
3 REPLIES
puneetvijwani
Contributor II

@Arnab0111 As of now it is not possible to save a PySpark DataFrame to a schema other than dbo (if you choose to save it in the Tables section of a Lakehouse).

You can create a view from your Lakehouse into a custom schema, or use the SQL endpoint of the Lakehouse and create a view in your Data Warehouse under the custom schema; the DataFrame write itself (sketched below) always lands in dbo.
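For instance, a one-line sketch of the dbo-bound write; df and the table name "sales" are placeholders:

```python
# Overwrite-mode write into the lakehouse Tables section; the table shows up
# under dbo when queried through the SQL endpoint. "sales" is a placeholder.
df.write.mode("overwrite").saveAsTable("sales")
```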

PS: If my response helped, kindly select it as the solution. Your kudos are greatly appreciated!

Can you please guide me on saving to the default dbo schema only?

