Hi Team,
I am trying to get the ADLS path for a folder in my container, but how can I get it directly from code snippets inside the notebook? Is there a glossary of keywords associated with each snippet, and if so, where can I access it?
Thanks
Hi @ramankr48
To access code snippets for working with Azure Data Lake Storage (ADLS) in a Microsoft Fabric notebook, you can use the built-in snippet functionality. Here's how to access and use relevant snippets:
1. Open your Microsoft Fabric notebook.
2. Click on the "Snippets" button in the toolbar.
3. In the search bar, type keywords related to ADLS or file paths, such as "adls", "storage", or "path".
4. Browse through the available snippets to find ones related to ADLS path handling.
If you don't find specific snippets for your needs, you can create custom snippets by following these steps:
1. Click on the "Manage code snippets" button in the Snippets panel.
2. Click "New snippet" to create a new custom snippet.
3. Enter a name, description, and the code for your snippet.
4. Save the snippet for future use.
Here are some useful code snippets:
from azure.storage.filedatalake import DataLakeServiceClient
from azure.identity import DefaultAzureCredential

# Connect to the ADLS Gen2 account using Azure AD credentials
account_name = "your_storage_account_name"
account_url = f"https://{account_name}.dfs.core.windows.net"
credential = DefaultAzureCredential()
service_client = DataLakeServiceClient(account_url, credential=credential)

# List everything under the target folder in the container
file_system_client = service_client.get_file_system_client("your_container_name")
paths = file_system_client.get_paths(path="your_folder_path")
for path in paths:
    print(path.name)
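If you only need to list the ADLS folder paths from inside the Fabric notebook itself, another option is the Spark utilities built into Fabric/Synapse notebooks; the sketch below is a minimal example, assuming mssparkutils is available in your notebook session, the notebook identity has access to the storage account, and the placeholders are replaced with your own container, account, and folder names.
# Sketch using the built-in Spark utilities (mssparkutils) to list a folder in ADLS Gen2;
# assumes the notebook identity already has access to the storage account.
from notebookutils import mssparkutils  # usually pre-imported in Fabric notebooks

adls_folder = "abfss://<container_name>@<storage_account_name>.dfs.core.windows.net/<folder_path>"

# fs.ls returns entries with name, path, and isDir, so the full ADLS path comes back directly
for item in mssparkutils.fs.ls(adls_folder):
    print(item.path)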
Please accept this solution and give kudos if this is helpful.
Thanks
Hi @ramankr48,
Thanks for posting in the Microsoft Fabric Community.
Here is a Python code example to get the ADLS path for each folder in a container:
from azure.storage.blob import BlobServiceClient

# Connect to the storage account with a connection string
connection_string = "<your connection string>"
container_name = "<your container name>"
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
container_client = blob_service_client.get_container_client(container_name)

# Collect the top-level folder names from the blob listing
folders = set()
for blob in container_client.list_blobs():
    # Only blobs whose names contain '/' sit inside a folder
    if '/' in blob.name:
        folders.add(blob.name.split('/')[0])

# Build the abfss:// ADLS Gen2 path for each folder
storage_account_name = "<your storage account name>"
for folder in folders:
    adls_path = f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/{folder}/"
    print(f"ADLS path for folder '{folder}': {adls_path}")
Here are some common keywords:
adls / abfss: used to specify Azure Data Lake Storage Gen2 paths in the notebook.
Example:
adls_path = "abfss://<container_name>@<storage_account_name>.dfs.core.windows.net/<folder_path>"
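Once you have an abfss path like the one above, you can use it directly in the notebook; here is a minimal sketch, assuming the folder contains CSV files and the notebook's Spark session has access to the storage account.
# Hypothetical usage: read data from the ADLS Gen2 folder with the notebook's built-in Spark session
adls_path = "abfss://<container_name>@<storage_account_name>.dfs.core.windows.net/<folder_path>"
df = spark.read.format("csv").option("header", "true").load(adls_path)
df.show(5)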
Hope these help.
If this post helps, then please consider accepting it as the solution to help other members find it more quickly; kudos would also be appreciated.
Best Regards.