Hello,
I'm writing a Fabric notebook to load a Parquet file into a Fabric Warehouse.
I want this in PySpark only, as the notebook is fully written in PySpark. Please advise on a fast and efficient way to load the data from the Lakehouse into the Warehouse.
My requirement is: if the table exists, I want to upsert; if not, I want to create the table and insert the data.
Please kindly share code.
Hi @westf,
Perhaps you can take a look at the following link on using a notebook to load data into a Data Warehouse.
Load data to MS Fabric Warehouse from notebook - Stack Overflow
Regards,
Xiaoxin Sheng
Below is the sample code that we use:
# Load the Parquet file into a Spark DataFrame
df = spark.read.parquet("path/to/your/parquet/file")

# Write the DataFrame to the Fabric Data Warehouse
df.write.mode("overwrite").saveAsTable("your_table_name")
Now, based on what I know, you can either append to or overwrite data in a table directly.
I am not sure about upsert; I need to validate that myself.
Note: you can use Spark SQL for the upsert.
What I meant is that I am not sure whether it is doable via PySpark alone.
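To make the Spark SQL suggestion concrete, here is a minimal sketch of an upsert using Delta Lake's MERGE INTO, which is what Spark SQL offers for this pattern in a Fabric notebook. All table and column names below (`your_table_name`, `staging`, `id`, `value`) are placeholders, not from the original post, and this assumes the target is a Delta table in the Lakehouse; the helper just builds the MERGE statement as a string, so the Spark-specific calls are shown as a commented flow.

```python
# Hypothetical helper that builds a Delta Lake MERGE INTO statement for an
# upsert: update rows that match on the key columns, insert the rest.
def build_merge_sql(target_table, source_view, key_cols, update_cols):
    # Join condition on the business key(s)
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    # Columns to refresh when a matching row already exists
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    return (
        f"MERGE INTO {target_table} AS t "
        f"USING {source_view} AS s "
        f"ON {on_clause} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT *"
    )

# In the notebook, the create-or-upsert flow would look roughly like this
# (untested sketch; names are placeholders):
#   df = spark.read.parquet("path/to/your/parquet/file")
#   if not spark.catalog.tableExists("your_table_name"):
#       df.write.saveAsTable("your_table_name")   # first load: create + insert
#   else:
#       df.createOrReplaceTempView("staging")
#       spark.sql(build_merge_sql("your_table_name", "staging",
#                                 key_cols=["id"], update_cols=["value"]))
```

Note that `saveAsTable` from a Spark notebook targets the Lakehouse; whether you can write to the Warehouse itself this way depends on your Fabric setup, so a common pattern is to land the Delta table in the Lakehouse and reference it from the Warehouse.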