SyedN
New Contributor II

Android App - OneLake Integration

Hi All, 

I have built an app that stores the SMS and call logs of my phone and saves them in JSON format. I am trying to upload that JSON file programmatically (using Java) from my app directly to OneLake. I have successfully obtained a SAS token, but I am stuck after that step: with a SAS token for OneLake available, how can I upload the JSON file to OneLake using Java?

I have looked at the OneLake documentation, but it only shows examples in Python, which I am not familiar with. Can anyone help me out with this scenario, please?
1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @SyedN,

Here are links with sample code for connecting to a lakehouse/data warehouse through the ODBC driver:

Load data to MS Fabric Warehouse from notebook - Stack Overflow

connect to fabric lakehouses warehouses from python code 

You can adapt these to Java by using an ODBC/JDBC driver: connect to the source with the corresponding connection string, data source, and credentials, then execute a SQL query over that connection to save your data to a table.
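As a rough sketch of the JDBC route: the Fabric SQL endpoint speaks the SQL Server protocol, so the Microsoft JDBC driver (mssql-jdbc) can be used. The host, database, and table names below are placeholders, not real values, and the authentication properties you need will depend on your tenant setup.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FabricJdbcSketch {

    // Build a JDBC connection string for a Fabric SQL endpoint.
    // Host and database names here are placeholders.
    static String buildUrl(String host, String database) {
        return "jdbc:sqlserver://" + host + ":1433;"
                + "database=" + database + ";"
                + "encrypt=true;trustServerCertificate=false;";
    }

    // Insert one JSON document into a (hypothetical) staging table
    // using a prepared statement.
    static void insertJson(Connection conn, String json) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO dbo.sms_logs (payload) VALUES (?)")) {
            ps.setString(1, json);
            ps.executeUpdate();
        }
    }

    // Usage sketch (requires mssql-jdbc on the classpath and valid
    // Azure AD credentials configured in the connection properties):
    static void run(String json) throws Exception {
        String url = buildUrl(
                "<your-endpoint>.datawarehouse.fabric.microsoft.com",
                "<your-warehouse>");
        try (Connection conn = DriverManager.getConnection(url)) {
            insertJson(conn, json);
        }
    }
}
```

Note this path writes rows to a warehouse table rather than uploading a raw file, so it fits if you are happy to parse the JSON into records first.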

If you mean to upload files to a Lakehouse, it is simpler to use a dataflow or data pipeline to get the data from your API into the Lakehouse.

Fabric decision guide - copy activity, dataflow, or Spark - Microsoft Fabric | Microsoft Learn

How to copy data using copy activity - Microsoft Fabric | Microsoft Learn
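Since the original question already has a SAS token in hand, a direct REST upload from Java is another option worth sketching: OneLake exposes an ADLS Gen2-compatible DFS endpoint, and the Gen2 Path API uploads a file in three steps (create, append, flush). The workspace and file paths below are placeholders, and this assumes the SAS token grants write permission and is appended as the query string, as with ADLS Gen2. On Android, where `java.net.http` is unavailable, the same requests can be issued with OkHttp.

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class OneLakeUploader {

    // OneLake exposes an ADLS Gen2-compatible endpoint.
    static final String ENDPOINT = "https://onelake.dfs.fabric.microsoft.com";

    // Step 1: create an empty file (PUT ?resource=file).
    static String createUrl(String workspace, String path, String sas) {
        return ENDPOINT + "/" + workspace + "/" + path + "?resource=file&" + sas;
    }

    // Step 2: append bytes at a given offset (PATCH ?action=append).
    static String appendUrl(String workspace, String path, long pos, String sas) {
        return ENDPOINT + "/" + workspace + "/" + path
                + "?action=append&position=" + pos + "&" + sas;
    }

    // Step 3: flush (commit) the file at its final length (PATCH ?action=flush).
    static String flushUrl(String workspace, String path, long len, String sas) {
        return ENDPOINT + "/" + workspace + "/" + path
                + "?action=flush&position=" + len + "&" + sas;
    }

    // Three-step ADLS Gen2-style upload: create, append, flush.
    static void upload(String workspace, String path, String json, String sas)
            throws IOException, InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        byte[] body = json.getBytes(StandardCharsets.UTF_8);

        send(client, "PUT", createUrl(workspace, path, sas),
                HttpRequest.BodyPublishers.noBody());
        send(client, "PATCH", appendUrl(workspace, path, 0, sas),
                HttpRequest.BodyPublishers.ofByteArray(body));
        send(client, "PATCH", flushUrl(workspace, path, body.length, sas),
                HttpRequest.BodyPublishers.noBody());
    }

    // Issue one request and fail loudly on a non-2xx response.
    static void send(HttpClient client, String method, String url,
                     HttpRequest.BodyPublisher body)
            throws IOException, InterruptedException {
        HttpRequest req = HttpRequest.newBuilder(URI.create(url))
                .method(method, body)
                .build();
        HttpResponse<String> resp =
                client.send(req, HttpResponse.BodyHandlers.ofString());
        if (resp.statusCode() / 100 != 2) {
            throw new IOException(method + " failed: HTTP "
                    + resp.statusCode() + " " + resp.body());
        }
    }
}
```

A call like `upload("<workspace>", "<lakehouse>.Lakehouse/Files/logs.json", jsonString, sasToken)` would then write the file under the Lakehouse Files area; paths and names here are illustrative, so check them against the OneLake documentation for your workspace.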

Regards,

Xiaoxin Sheng

