yllsuarez76
New Contributor III

Copy job: changing Lakehouse IDs causing code to break

I am working on my feature branch, using a Copy job to push data from a remote SQL database into a Lakehouse.

 

Once I merge my code into the main branch and open the production workspace, the Copy job is broken, for an obvious reason: the Lakehouse ID has changed.

 

The question is: how do I maintain the code so it can move between branches without breaking during merges?

3 Replies
spencer_sa
Contributor III

We maintain a table of workspace/lakehouse IDs keyed by name and environment, and look these up at pipeline run time.
You can automate the creation of this table using sempy-labs, the API, and/or notebookutils.lakehouse.get.
Technically you could also look these up dynamically in a notebook that runs before any other job and returns the list of lakehouses and their IDs in its exitValue (see the sketch below).
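A minimal sketch of that second approach, assuming a Fabric Python notebook where notebookutils is available. The lakehouse name ("Sales_LH") is a placeholder, and the exact shape of the object returned by notebookutils.lakehouse.get is an assumption, so inspect it in your own environment before relying on specific fields.

```python
# Runs first in the pipeline: resolves Lakehouse IDs by display name in the
# current workspace and hands them back to the pipeline via the exit value.
import json
import notebookutils  # pre-installed in Fabric notebooks; the explicit import may be optional there

# Lakehouse display names stay the same across workspaces; "Sales_LH" is a placeholder.
lakehouse_names = ["Sales_LH"]

resolved = {}
for name in lakehouse_names:
    item = notebookutils.lakehouse.get(name)  # looks the item up in the current workspace
    # Attribute access below is an assumption -- print(item) to confirm the structure.
    resolved[name] = item.id

# Downstream activities read this from the notebook activity's exit value
# and use it to parameterize connections instead of hard-coded IDs.
notebookutils.notebook.exit(json.dumps(resolved))
```

In the pipeline, downstream activities can then reference the notebook activity's output (typically something like @activity('ResolveLakehouses').output.result.exitValue, where the activity name is hypothetical) rather than a hard-coded Lakehouse ID.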

Anonymous
Not applicable

Hi @yllsuarez76,

AFAIK, this is not currently supported. Perhaps you can submit an idea to improve the Copy job experience:

Microsoft Fabric Ideas

Regards,

Xiaoxin Sheng

Anonymous
Not applicable

Hi @yllsuarez76,

Did the above suggestions help with your scenario? If so, please consider giving a Kudo or Accepting the helpful suggestions as a solution, to help others with similar requirements.

Regards,

Xiaoxin Sheng
