
ramonsuarez
New Contributor III

Programmatically upload files to Fabric?

Hi, 

 

We want to be able to periodically upload a set of files to Fabric, always to the same folder, overwriting the previous versions of those files.

 

We have a PowerShell script on a server that we run to extract the data (XML). It currently saves the files to SharePoint, but we would like to save them directly into our Lakehouse without intermediary steps.

 

I've checked the API documentation and found how to do many operations, but not how to upload files directly to a folder in the Lakehouse. 

 

I've also found that service principals are not supported when using the Microsoft Fabric REST APIs. 

 

Can you help? Thanks!

1 ACCEPTED SOLUTION

Hi @ramonsuarez ,

You can also try installing the 'OneLake' file explorer program on the server and using it just like you use OneDrive. This approach only requires adding a copy statement to your existing PowerShell script.

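A minimal sketch of what that copy step could look like, assuming the OneLake file explorer is installed on the server and syncing under the user profile; the local path, workspace, and lakehouse names are placeholders to adapt:

    # Folder where the extraction script writes the XML files (placeholder path)
    $source = 'C:\Extract\*.xml'

    # Folder synced by the OneLake file explorer; verify the exact root in Windows
    # Explorer after installing it (workspace and lakehouse names are placeholders)
    $destination = "$env:USERPROFILE\OneLake - Microsoft\MyWorkspace\MyLakehouse.Lakehouse\Files\xml"

    # -Force overwrites the previous versions; the OneLake file explorer then syncs
    # the change up to the lakehouse
    Copy-Item -Path $source -Destination $destination -Force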

 


8 REPLIES
v-cboorla-msft
Honored Contributor II

Hi @ramonsuarez 

 

Thanks for the question and for using the Microsoft Fabric Community.


At this time, we are reaching out to the internal team to get some help on this.
We will update you once we hear back from them.

Appreciate your patience.

 

Thanks

 

Hi @ramonsuarez 

 

Could you please confirm whether you are using a notebook or a pipeline?

What is the source of your data, and will it be accessible from a notebook?

 

Thanks

Neither. The script runs on a server and extracts the tables from the source system as XML files.

Hi @ramonsuarez 

 

There are multiple ways to upload files to OneLake. You can try these options:

  1. Pipelines (as Sumit Kumar pointed out earlier)
  2. AzCopy (you call the CLI and it copies files to the /Tables directory; see the sketch after this list). There's a good blog covering the details: Ingest Data into Microsoft Fabric OneLake using AzCopy | by Inderjit Rana | Microsoft Azure | Medium
  3. APIs
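
As a rough sketch of option 2 from PowerShell, assuming AzCopy v10 is installed on the server and shown here copying into the lakehouse Files area; the workspace, lakehouse, and folder names are placeholders, and the exact flags can vary by AzCopy version, so see the blog above for the details:

    # Sign in with an account that has access to the workspace
    azcopy login

    # Copy the local XML files into the lakehouse via the OneLake endpoint;
    # --overwrite=true replaces the previous versions of the files
    azcopy copy "C:\Extract\*.xml" `
        "https://onelake.blob.fabric.microsoft.com/MyWorkspace/MyLakehouse.Lakehouse/Files/xml" `
        --overwrite=true `
        --trusted-microsoft-suffixes "onelake.blob.fabric.microsoft.com"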

I hope this information helps. Please do let us know if you have any further questions.

 

Thanks

Hi @ramonsuarez 

 

We haven't heard back from you on the last response and wanted to check whether you have found a resolution yet.
If you have a resolution, please do share it with the community, as it can be helpful to others.
Otherwise, we will respond with more details and try to help.


Thanks

A colleague found a PowerShell script to upload the files directly from the server, and we run it with Windows Task Scheduler.
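
Not the exact script, but here is a minimal sketch of one way to do it: the Az.Storage PowerShell module can write to OneLake through its ADLS Gen2-compatible endpoint. The workspace, lakehouse, and folder names below are placeholders, and an unattended run under Task Scheduler would need a non-interactive sign-in:

    Import-Module Az.Storage

    # Interactive sign-in; use a stored credential or certificate for scheduled runs
    Connect-AzAccount

    # OneLake exposes an ADLS Gen2-compatible endpoint: 'onelake' is the account name
    # and the workspace acts as the file system (container)
    $ctx = New-AzStorageContext -StorageAccountName 'onelake' -UseConnectedAccount -Endpoint 'fabric.microsoft.com'

    $workspace = 'MyWorkspace'                     # placeholder workspace name
    $target    = 'MyLakehouse.Lakehouse/Files/xml' # placeholder lakehouse folder

    # Upload each extracted XML file, overwriting the previous version (-Force)
    Get-ChildItem 'C:\Extract' -Filter *.xml | ForEach-Object {
        New-AzDataLakeGen2Item -Context $ctx -FileSystem $workspace `
            -Path "$target/$($_.Name)" -Source $_.FullName -Force
    }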

Hi @ramonsuarez, would you be able to share the PS script, please? It would be very useful.
