I have a bunch of Delta tables which I need to save as .json files and zip up to supply to a third-party tool for further processing.
I was able to get all the Delta tables into .json files, but I'm getting an error when zipping them.
My .json files are in the folder
'Files/temp/consolidated/'
Now I want to zip them all into 'sample.zip'. Here is my code; I'm not sure what I'm doing wrong. zipf.write can't find the json_file_path even though it looks right. Any clues? Or is there a different way to do it?
import zipfile

with zipfile.ZipFile('sample.zip', 'w') as zipf:
    for f in filenames:
        # Relative OneLake path to each JSON file
        json_file_path = f"Files/temp/consolidated/{f}"
        print(json_file_path)
        zipf.write(json_file_path, arcname=f)
Microsoft Fabric Context: The Fabric environment exposes OneLake through a virtual file system, which doesn't provide direct file path access the way a traditional file system does.
zipfile.ZipFile.write Limitation: This method expects a real local file path, which a relative OneLake path such as 'Files/temp/consolidated/...' is not.
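To illustrate the mismatch, here is a minimal sketch; it assumes a Fabric notebook with a default lakehouse attached, which Fabric exposes at the local path /lakehouse/default:

import os

# The relative OneLake path is not a local file system path,
# so os / zipfile cannot resolve it:
print(os.path.exists("Files/temp/consolidated"))  # typically False

# The attached lakehouse is visible under Fabric's local mount:
print(os.path.exists("/lakehouse/default/Files/temp/consolidated"))  # True if the folder exists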
It's pretty clear that what you are trying to do is not supported. Don't use zipfile with a direct file path - use its in-memory equivalent, whatever that is.
Thanks lbendlin for the reply. Do you have any sample code or links for doing this zipping in memory?
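For reference, a minimal sketch of the in-memory approach, assuming a Fabric notebook with a default lakehouse attached (its Files area is then visible at the local path /lakehouse/default/Files); the folder and file names are taken from the question:

import io
import os
import zipfile

src_dir = "/lakehouse/default/Files/temp/consolidated"

# Build the archive in memory so zipfile never sees a OneLake path
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zipf:
    for f in os.listdir(src_dir):
        if f.endswith(".json"):
            zipf.write(os.path.join(src_dir, f), arcname=f)

# Persist the finished archive back to the lakehouse Files area
with open("/lakehouse/default/Files/sample.zip", "wb") as out:
    out.write(buf.getvalue())

Building the archive in a BytesIO keeps zipfile away from OneLake paths entirely; only the final write touches the mounted lakehouse path.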
Maybe you can mount the lakehouse through the Microsoft Spark Utilities package? Mounting would let you use the local file API to access data under the mount point as if it were stored in the local file system.
Access files under the mount point via local path
I haven't used it to create zip files, but you can give it a try.
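A rough, untested sketch of that mount idea; the ABFS URL and mount name below are placeholders you would fill in for your own workspace and lakehouse:

from notebookutils import mssparkutils
import os
import zipfile

# Mount the lakehouse; <workspace> and <lakehouse> are placeholders
mssparkutils.fs.mount(
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse",
    "/mnt_lakehouse",
)
local_root = mssparkutils.fs.getMountPath("/mnt_lakehouse")

src_dir = os.path.join(local_root, "Files/temp/consolidated")
zip_path = os.path.join(local_root, "Files/sample.zip")

# With the mount in place, zipfile works with ordinary local paths
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
    for f in os.listdir(src_dir):
        if f.endswith(".json"):
            zipf.write(os.path.join(src_dir, f), arcname=f)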
Best Regards,
Jing
Have you resolved this issue? If any of the answers provided were helpful, please consider accepting them as a solution. If you have found other solutions, we would greatly appreciate it if you could share them with us. Thank you!
Best Regards,
Jing
Why not use a Copy activity in a pipeline? You can iterate over all the tables you want.
Delta table as your source
OneLake file as your destination, with JSON as the file type and the compression type set to .zip