data_quantum
New Contributor

Sharing data progress outside Fabric Notebook

Hi, 

 

I am connecting to a Lakehouse from a Notebook within Microsoft Fabric. I run an analysis on a dataframe using PySpark, e.g. ydata_profiling, that produces an inline report of my data. Great, but...

 

How do I export or share that report with business users, quickly? The profile report is interactive, so a PDF won't suffice. In general, I want to know the best way to share data science progress with stakeholders outside of the Fabric Notebook (and not Power BI, which is incredibly slow, clunky and adds too many extra steps).

 

In RStudio, I would use a Shiny app. I tried looking at the streamlit Python package, but I can't get it to run in a Fabric notebook (I might be doing something wrong, however).

 

Is there something that will work directly from a Fabric Notebook?

 

Thanks

4 REPLIES
data_quantum
New Contributor

Current workflow is:

 

Read a Lakehouse table into a Spark dataframe in a Notebook

Run any analysis (e.g. a ProfileReport from the ydata_profiling package)

Write the profile report to an HTML file in the Lakehouse (rough snippet below)

Use OneLake file explorer in Windows Explorer to view the HTML file in a browser (requires installing OneLake file explorer: https://learn.microsoft.com/en-us/fabric/onelake/onelake-file-explorer)

 

Kinda works for now
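
Roughly, the code for steps 2 and 3 looks like this (assuming the default Lakehouse is attached to the notebook so it is mounted at /lakehouse/default; the reports folder and file name are just placeholders):

import os
from ydata_profiling import ProfileReport

# df is a pandas dataframe here; convert a Spark dataframe first, e.g. df = spark_df.toPandas()
profile = ProfileReport(df, title="Lakehouse table profile")

# Write the interactive HTML report into the attached Lakehouse's Files area,
# so it shows up under Files/reports in OneLake file explorer
os.makedirs("/lakehouse/default/Files/reports", exist_ok=True)
profile.to_file("/lakehouse/default/Files/reports/profile_report.html")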

 

DavidDalley66
New Contributor II

Yeah, so the best move is probably exporting your notebook output to OneDrive or Azure Blob Storage and just sharing access from there. You can also save the data as CSV or Parquet files and plug it into Power BI or whatever dashboard tool you like. Fabric doesn't let you share live notebook progress yet (rip), but this workaround works fine. Honestly, I'm hoping Microsoft adds a built-in way to share soon; it would make life way easier for collaboration and updates.
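
For the export step, a minimal sketch (assuming a Spark dataframe df in the Fabric notebook and that the notebook identity has access to the target storage account; container, account and folder names are placeholders):

out_base = "abfss://<container>@<storageaccount>.dfs.core.windows.net/shared"

# Parquet is the better format for handing off to Power BI or another app
df.write.mode("overwrite").parquet(f"{out_base}/profile_data_parquet")

# Single CSV file for quick ad-hoc sharing
(df.coalesce(1)
   .write.mode("overwrite")
   .option("header", True)
   .csv(f"{out_base}/profile_data_csv"))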


anilgavhane
Contributor

What Doesn't Work Well

  • PDFs: Static, not suitable for interactive profiling reports.
  • Power BI: Powerful but often overkill for quick sharing, and adds friction.
  • Streamlit inside Fabric: Not currently supported natively within Fabric notebooks due to environment limitations.

 

Best Ways to Share Interactive Reports Outside Fabric

1. Export Data + Host Streamlit Locally

Since Streamlit doesn't run inside Fabric, you can:

  • Export your processed data (e.g. as CSV or Parquet) from the notebook.
  • Build a local Streamlit app on your laptop using ydata-profiling or pandas-profiling.
  • Share the app via:
      • Internal web server (e.g. localhost or intranet)
      • Streamlit Cloud (for public or semi-private sharing)
      • Docker container (for reproducibility)

This gives you full interactivity and control.
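
A minimal sketch of such a local app (assuming the data was exported as profile_data.parquet and that streamlit, pandas and ydata-profiling are installed locally; the file name is a placeholder):

# app.py - run locally with: streamlit run app.py
import pandas as pd
import streamlit as st
import streamlit.components.v1 as components
from ydata_profiling import ProfileReport

st.title("Data profile")

# Load the data exported from the Fabric notebook
df = pd.read_parquet("profile_data.parquet")

# Build the profiling report and embed its interactive HTML in the page
profile = ProfileReport(df, title="Profiling Report")
components.html(profile.to_html(), height=1200, scrolling=True)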

2. Use HTML Export from Profiling Tools

If you're using ydata-profiling, you can export the report as an HTML file:

 

from ydata_profiling import ProfileReport

profile = ProfileReport(df)
profile.to_file("report.html")

 

Then share the HTML file via email, Teams, or a shared drive. It's interactive and lightweight, with no need for Streamlit.

3. Fabric + OneLake + External App

  • Store the processed data in OneLake.
  • Build a lightweight web app (Streamlit, Dash, Flask) outside Fabric that reads from OneLake.
  • This separates compute from presentation and avoids tying the report to your notebook session.
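
A rough sketch of the "reads from OneLake" part (assuming azure-identity and azure-storage-file-datalake are installed and the app identity has access to the workspace; workspace, lakehouse and path names are placeholders):

import io

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake exposes an ADLS Gen2-compatible endpoint; the workspace acts as the
# container and the item path (lakehouse/Files/...) follows it
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)
file_client = service.get_file_system_client("<workspace>").get_file_client(
    "<lakehouse>.Lakehouse/Files/reports/profile_data.parquet"
)

df = pd.read_parquet(io.BytesIO(file_client.download_file().readall()))
print(df.head())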

deborshi_nag
Contributor

Hello @data_quantum 

 

If you have access to Databricks, which is a first-party Microsoft offering, you can use Databricks Apps to host a Streamlit/Dash/Flask app that can connect to your Fabric Lakehouse SQL endpoint.
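
As a rough sketch of the connection piece (server, database and table names are placeholders; assumes the app environment has pyodbc with ODBC Driver 18 for SQL Server and a service principal with access to the Lakehouse):

import pandas as pd
import pyodbc

# Copy the SQL connection string from the Lakehouse SQL analytics endpoint
# settings in Fabric and fill in the placeholders below
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your_lakehouse>;"
    "Authentication=ActiveDirectoryServicePrincipal;"
    "UID=<client-id>;PWD=<client-secret>;"
    "Encrypt=yes;"
)

df = pd.read_sql("SELECT TOP 100 * FROM dbo.my_table", conn)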

 

Here's the link to the Microsoft docs:

 

Key concepts in Databricks Apps - Azure Databricks | Microsoft Learn

 

Hope this helps. If it does, please show your appreciation by leaving a Kudos or accepting this as a Solution to help others!

 

 

