CeeVee33
New Contributor III

How to get datepart in Expressions

Hi Experts - I want to extract the day, month and year parts of today's date. How can I achieve this while building dynamic content for my CSV folder path and filename?

 

CSVs are saved in 

\\Servername\ABCFolder\2025\JANUARY\02.01.2025\MyFirstFav_02012025.csv

\\Servername\ABCFolder\2025\JANUARY\02.01.2025\MySecondFav_02012025.csv

etc

 

They all have same structure.

 

I intend to create a pipeline to ingest them: an initial one-off load, then a daily run. For the daily run, the pipeline needs to keep changing the filename based on the date.

 

Also, if there is a tutorial on ingesting multiple files into a table using a data pipeline, please let me know. I could only find tutorials on direct upload to the Lakehouse, or a pipeline upload to the Lakehouse as files, not as a table.

 

Any help is much appreciated.

 

1 ACCEPTED SOLUTION
v-hashadapu
Honored Contributor II

Hi @CeeVee33 , Thank you for reaching out to the Microsoft Community Forum.

  1. To dynamically create the folder path and filenames based on today's date in Microsoft Fabric, you can use the following expressions directly in the pipeline's dynamic content.

Folder path: @concat('\\Servername\ABCFolder\',
         formatDateTime(utcNow(), 'yyyy'), '\',
         toUpper(formatDateTime(utcNow(), 'MMMM')), '\',
         formatDateTime(utcNow(), 'dd.MM.yyyy'))

Output: \\Servername\ABCFolder\2025\MARCH\20.03.2025

Example File1: @concat('MyFirstFav_', formatDateTime(utcNow(), 'ddMMyyyy'), '.csv')

Output: MyFirstFav_20032025.csv

Example File2: @concat('MySecondFav_', formatDateTime(utcNow(), 'ddMMyyyy'), '.csv')

Output: MySecondFav_20032025.csv
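As a quick offline sanity check (not part of the pipeline itself), the same format tokens can be mirrored with Python's strftime codes: yyyy maps to %Y, MMMM to %B, dd.MM.yyyy to %d.%m.%Y. The date below is pinned to 20 March 2025 so the output matches the examples above; month names assume an English locale.

```python
from datetime import datetime, timezone

# Mirror the pipeline's formatDateTime(utcNow(), ...) calls.
# Pinned to a fixed date so the result matches the forum examples.
now = datetime(2025, 3, 20, tzinfo=timezone.utc)

folder = r'\\Servername\ABCFolder' + '\\{}\\{}\\{}'.format(
    now.strftime('%Y'),           # yyyy       -> 2025
    now.strftime('%B').upper(),   # MMMM upper -> MARCH (English locale assumed)
    now.strftime('%d.%m.%Y'),     # dd.MM.yyyy -> 20.03.2025
)
filename = 'MyFirstFav_{}.csv'.format(now.strftime('%d%m%Y'))  # ddMMyyyy

print(folder)    # \\Servername\ABCFolder\2025\MARCH\20.03.2025
print(filename)  # MyFirstFav_20032025.csv
```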

  2. To ingest multiple CSVs into a table in Fabric, create a pipeline with these steps: use a Get Metadata activity to list the files in the dynamic folder path; use a ForEach activity to iterate over the file list; inside the loop, use a Copy Data activity with a wildcard file path: \\Servername\ABCFolder\@{formatDateTime(utcNow(), 'yyyy')}\@{toUpper(formatDateTime(utcNow(), 'MMMM'))}\@{formatDateTime(utcNow(), 'dd.MM.yyyy')}\*.csv . Configure the sink to point to your Lakehouse table and use auto mapping to map the CSV columns to table columns. Finally, add a schedule trigger so the pipeline runs daily.
  3. Try these learn documents:

Module 1: Create a pipeline with Data Factory

Lakehouse tutorial: Ingest data into the lakehouse

Ingest data into your Warehouse using data pipelines
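Outside Fabric, the Get Metadata, ForEach and Copy Data steps above can be sketched in plain Python as a minimal stand-in, assuming all files share the same header. The temp folder and sample filenames below are illustrative only:

```python
import csv
import glob
import os
import tempfile

def ingest_folder(folder):
    """List every *.csv in the folder (Get Metadata), loop over the
    files (ForEach) and append all rows into one table-like list
    (Copy Data). Assumes every file shares the same header."""
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, '*.csv'))):
        with open(path, newline='') as f:
            rows.extend(csv.DictReader(f))
    return rows

# Demo with two files shaped like the MyFirstFav/MySecondFav pair.
tmp = tempfile.mkdtemp()
for name in ('MyFirstFav_20032025.csv', 'MySecondFav_20032025.csv'):
    with open(os.path.join(tmp, name), 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['id', 'source'])
        writer.writerow([1, name])

rows = ingest_folder(tmp)
print(len(rows))  # one row per file
```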

 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.


5 REPLIES

CeeVee33
New Contributor III

That is a very detailed explanation, @v-hashadapu . I really appreciate it.

 

I have a few other unplanned things to execute today, but I will get to your solution on Monday and come back to you.

 

Thanks

 

v-hashadapu
Honored Contributor II

Hi @CeeVee33 , Thanks for the response. Happy to wait. 

v-hashadapu
Honored Contributor II

Hi @CeeVee33 , 
Please let us know if your issue is solved. If it is, consider marking the answers that helped 'Accept as Solution', so others with similar queries can find them easily. If not, please share the details.
Thank you.

CeeVee33
New Contributor III

Hi @v-hashadapu - thank you very much for following up.

 

Yes, your suggestions worked like a charm. The files loaded very smoothly.

 

Thank you again
