How to bring all Planetary Computer catalog data for a specific region into Microsoft Fabric Lakehouse?
Hi everyone, I'm currently working on something where I need to bring all available catalog data from the Microsoft Planetary Computer into a Microsoft Fabric Lakehouse, but I want to filter it for a specific region or area of interest.
I've been looking around, but I'm a bit stuck on how to approach this.
I have tried to get data into the Lakehouse from a notebook using Python scripts (pystac-client, planetary-computer, adlfs), and I have loaded individual items as .tiff files (rough sketch below).
But I want to ingest all catalog data for a particular region. Is there any bulk data ingestion method for this?
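Here's roughly what my notebook does today. It's only a minimal sketch; the collection, bounding box, date range, and output path are placeholders for my real values, and it assumes a default Lakehouse is attached to the notebook (so it's mounted at /lakehouse/default).

```python
import os
import planetary_computer
import pystac_client
import requests

# Open the Planetary Computer STAC API; sign_inplace adds SAS tokens to asset HREFs
catalog = pystac_client.Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1",
    modifier=planetary_computer.sign_inplace,
)

bbox = [77.5, 12.9, 77.7, 13.1]  # area of interest: minx, miny, maxx, maxy (placeholder)

search = catalog.search(
    collections=["sentinel-2-l2a"],          # example collection
    bbox=bbox,
    datetime="2024-01-01/2024-03-31",        # example date range
)

out_dir = "/lakehouse/default/Files/planetary"
os.makedirs(out_dir, exist_ok=True)

for item in search.items():
    href = item.assets["B04"].href           # red band as an example asset
    out_path = f"{out_dir}/{item.id}_B04.tif"
    # Stream the GeoTIFF over HTTPS straight into the Lakehouse Files area
    with requests.get(href, stream=True) as r:
        r.raise_for_status()
        with open(out_path, "wb") as f:
            for chunk in r.iter_content(chunk_size=8 * 1024 * 1024):
                f.write(chunk)
```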
Is there a way to do this using Fabric's built-in tools, like a native connector or pipeline?
Can this be done using the STAC API and some kind of automation, maybe with Fabric Data Factory or a Fabric Notebook?
What's the best way to handle large-scale ingestion for a whole region? Is there any bulk loading approach that people are using?
Also, any tips on things like storage format, metadata, or authentication between the Planetary Computer and OneLake would be super helpful.
And finally, is there any way to visualize it in Power BI? (I'm currently planning to use the data in a web app, but is visualization in Power BI also possible?)
I'd love to hear if anyone here has tried something similar or has any advice on how to get started!
Thanks in advance!
TL;DR: trying to load all Planetary Computer data for a specific region into a Lakehouse. Looking for the best approach.
The Planetary Computer Data Catalog includes petabytes of environmental monitoring data, in consistent, analysis-ready formats. All of the datasets below can be accessed via Azure Blob Storage.
Set up your Azure Blob Storage connection - Microsoft Fabric | Microsoft Learn
Note: "all" = "petabytes". Even if you use shortcuts this can easily overwhelm your budget.
Hi @mild_heart,
We would like to follow up to see if the solution provided by the super user resolved your issue. Please let us know if you need any further assistance.
@lbendlin, thanks for your prompt response.
Thanks,
Prashanth Are
MS Fabric community support
If the super user's response resolved your issue, please mark it as "Accept as solution" and click "Yes" if you found it helpful.
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
If we don't hear back, we'll go ahead and close this thread. For any further discussions or questions, please start a new thread in the Microsoft Fabric Community Forum; we'll be happy to assist.
Thank you for being part of the Microsoft Fabric Community.