Karthick_Balaje
New Contributor II

How to Ingest Stored Procedure Results from Azure SQL DB into Fabric OneLake / Bronze Lakehouse

Hi All!

I have been stuck on a blocker while working on a task. I want to use a stored procedure and a TimeValue function from an Azure SQL DB to build a KPI in Fabric. I have been stuck on this for four to five hours, so I thought I would post it here. The task needs to be done so that the Power BI resource can work on the visualization. How should I approach this issue?

Thanks and Regards!
Karthick

2 ACCEPTED SOLUTIONS
Shahid12523
Honored Contributor

You can't ingest stored procedure results directly into a Fabric Lakehouse.
Workarounds:

Best → Convert the SP logic into a view and ingest it via Dataflow Gen2/Pipeline (see the sketch below this post).

Else → Have the SP insert its results into a staging table, then copy that table to the Bronze Lakehouse.

Pipeline option → Use the Stored Procedure activity (if enabled) to execute the SP and land the output in the Lakehouse.

Transform in Fabric → Replicate the SP logic (like TimeValue) in Dataflow Gen2 instead of in the DB.

Shahed Shaikh
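
A minimal T-SQL sketch of the "Best → convert the SP logic into a view" option above. The view, table, and column names are placeholders rather than anything from the original thread, and the assumption is that the procedure's body is essentially a shaped SELECT:

```sql
-- Hypothetical: the SELECT logic of the stored procedure re-expressed as a view.
-- dbo.vw_KpiData, dbo.Sales and the columns are placeholder names.
CREATE VIEW dbo.vw_KpiData
AS
SELECT
    s.SaleId,
    s.SaleDate,
    CAST(s.SaleDate AS time(0)) AS SaleTimeOfDay,  -- stand-in for the TimeValue-style logic
    s.Amount
FROM dbo.Sales AS s
WHERE s.SaleDate >= DATEADD(DAY, -30, GETDATE());
```

Dataflow Gen2 or a pipeline Copy Data activity can then read dbo.vw_KpiData like any other table and write it to a Bronze Lakehouse table.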


anilgavhane
Contributor

@Karthick_Balaje  

1. Convert Stored Procedure Logic into a View

  • Create a SQL view that replicates the logic of your stored procedure.
  • Use Dataflow Gen2 or a Pipeline in Fabric to ingest the view into your Lakehouse.
  • This is the cleanest and most scalable method.

2. Use a Staging Table

  • Modify your stored procedure to insert results into a staging table.
  • Then use a Copy Data activity in Fabric to move that table into your Bronze Lakehouse (a staging-table sketch follows this list).

3. Use Script Activity in a Pipeline

  • Run the stored procedure from a Script (or Stored Procedure) activity in a Fabric pipeline so its output is materialized (for example into the staging table above), then land it in the Lakehouse with a Copy Data activity (a sketch follows this list).

4. Replicate Logic in Fabric

  • If the stored procedure uses functions like TimeValue, consider replicating that logic in Dataflow Gen2 using Power Query M.
  • This avoids a dependency on SQL Server functions and keeps the transformation native to Fabric (an M sketch follows this list).
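
A minimal T-SQL sketch of the staging-table pattern from option 2. All object names (dbo.stg_KpiData, dbo.usp_LoadKpiStaging, dbo.Sales) are placeholders, not names from the original thread, and the column list is purely illustrative:

```sql
-- Hypothetical staging table; names and columns are placeholders.
CREATE TABLE dbo.stg_KpiData
(
    SaleId        int            NOT NULL,
    SaleDate      datetime2(0)   NOT NULL,
    SaleTimeOfDay time(0)        NOT NULL,
    Amount        decimal(18, 2) NOT NULL,
    LoadedAtUtc   datetime2(0)   NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

-- The stored procedure writes its result set into the staging table
-- instead of (or in addition to) returning it to the caller.
CREATE OR ALTER PROCEDURE dbo.usp_LoadKpiStaging
AS
BEGIN
    SET NOCOUNT ON;

    TRUNCATE TABLE dbo.stg_KpiData;   -- full refresh on every run

    INSERT INTO dbo.stg_KpiData (SaleId, SaleDate, SaleTimeOfDay, Amount)
    SELECT
        s.SaleId,
        s.SaleDate,
        CAST(s.SaleDate AS time(0)),  -- stand-in for the TimeValue-style logic
        s.Amount
    FROM dbo.Sales AS s
    WHERE s.SaleDate >= DATEADD(DAY, -30, GETDATE());
END;
```

A Copy Data activity can then read dbo.stg_KpiData and write it to a Bronze Lakehouse table.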
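For option 3, the pipeline can run that procedure before the copy step. This is only a sketch of the statement you might paste into a Script (or Stored Procedure) activity, using the placeholder procedure name from the previous example:

```sql
-- Hypothetical: executed by a Script / Stored Procedure activity in the Fabric
-- pipeline, immediately before the Copy Data activity that reads
-- dbo.stg_KpiData into the Bronze Lakehouse.
EXEC dbo.usp_LoadKpiStaging;
```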
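And for option 4, one possible Power Query M sketch of replicating TimeValue-style logic inside Dataflow Gen2. The server, database, table, and SaleDate column are placeholders, and the exact expression depends on what the original TimeValue call was doing:

```powerquery-m
// Hypothetical Dataflow Gen2 query: derive the time-of-day in M instead of
// relying on a SQL-side TimeValue-style function.
let
    Source = Sql.Database("myserver.database.windows.net", "MyDatabase"),
    Sales  = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    AddTimeOfDay = Table.AddColumn(
        Sales,
        "SaleTimeOfDay",
        each DateTime.Time([SaleDate]),   // keep only the time part of the datetime
        type time
    )
in
    AddTimeOfDay
```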

 

Next Steps

  • Choose the method that best fits your architecture and governance model.
  • Once the data lands in the Lakehouse, Power BI can connect directly to the default dataset or SQL endpoint for visualization.


7 REPLIES
spaceman127
Contributor

Hello @Karthick_Balaje ,

 

So, if I understand your title correctly, you want to use an Azure SQL stored procedure to load data directly into a Lakehouse.

- Is there an error message?

- Do you use a data pipeline?

 

There are several ways to achieve your goal.

Here is an example of how to load data into the lakehouse.

 

https://learn.microsoft.com/en-us/fabric/data-factory/tutorial-move-data-lakehouse-pipeline

 

Please give us a little more information so we can help you.

 

Best regards

v-prasare
Honored Contributor II

Hi @Karthick_Balaje,

We would like to confirm whether our community members' answers resolve your query or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.

 

@Shahid12523 @spaceman127, thanks for your prompt responses.

 

 

Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are


Karthick_Balaje
New Contributor II

My query is resolved, and thank you @anilgavhane @v-prasare @Shahid12523 for the responses!

v-prasare
Honored Contributor II

May I ask if you have resolved this issue? If so, can you please share the resolution steps here? This will help other community members with similar problems solve them faster.
If we don't hear back, we'll go ahead and close this thread. For any further discussions or questions, please start a new thread in the Microsoft Fabric Community Forum; we'll be happy to assist.
Thank you for being part of the Microsoft Fabric Community.
