SooryaP
New Contributor

Concurrency lock condition error when writing multiple records to an audit log table simultaneously

  • Issue: Facing lock condition errors when writing multiple records to an audit log table simultaneously.
  • Setup:
    • One master pipeline invoking multiple child pipelines.
    • Audit log notebook placed in both master and child pipelines.
    • During a child pipeline failure, the notebook activity updates the audit log table in parallel for both child and master pipeline rows.
  • Problem: Although each update targets a separate row, lock conflicts still occur, causing pipeline failures.
  • Attempts to Resolve:
    • Applied PARTITION BY.
    • Used UPDATE statements.
    • Neither approach resolved the issue in Fabric.

Framed Question:

What are some effective strategies or best practices to avoid lock condition errors when writing multiple records to an audit log table simultaneously in a pipeline setup, especially in a Fabric environment?
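One strategy often suggested for this kind of contention (not from this thread; a minimal pure-Python sketch with made-up names) is to wrap the audit write in a retry loop with exponential backoff and jitter, so competing writers that hit a transient lock error back off and try again at staggered times:

```python
import random
import time

class TransientLockError(Exception):
    """Stand-in for a 'lock request timed out' style error (hypothetical)."""

def write_audit_row(failures_left: list) -> str:
    # Simulated audit write: raises a lock error until the counter runs out.
    if failures_left[0] > 0:
        failures_left[0] -= 1
        raise TransientLockError("table lock held by another writer")
    return "row written"

def with_retry(fn, *args, max_attempts=5, base_delay=0.01):
    """Retry a write on transient lock errors with exponential backoff + jitter."""
    for attempt in range(max_attempts):
        try:
            return fn(*args)
        except TransientLockError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error to the pipeline
            # Exponential backoff with jitter spreads competing writers apart.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))

state = [2]  # fail twice, then succeed
print(with_retry(write_audit_row, state))
```

In a real notebook the simulated `write_audit_row` would be the actual UPDATE/INSERT call, and the exception caught would be whatever lock-timeout error the engine raises.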



2 Replies
AndyDDC
Valued Contributor

Hi @SooryaP, are you using the Warehouse service? It uses snapshot isolation and has only table-level locking, so concurrent UPDATE statements will likely cause conflicts. I explain this further here:

 

https://www.serverlesssql.com/transaction-isolation-in-fabric-warehouses/
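Given table-level locking on UPDATEs, one way to sidestep the conflict entirely is to make the audit log append-only: every pipeline INSERTs a new status row instead of updating an existing one, and readers take the latest row per pipeline. A sketch of the idea, using in-memory SQLite as a stand-in for the Warehouse (table and column names are illustrative, not from the thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE audit_log (run_id TEXT, pipeline TEXT, status TEXT, seq INTEGER)"
)

# Each writer only ever INSERTs a new row; nothing UPDATEs an existing row,
# so there is no update/update conflict for the table lock to serialize.
events = [
    ("run-1", "master",  "Running", 1),
    ("run-1", "child_a", "Running", 2),
    ("run-1", "child_a", "Failed",  3),
    ("run-1", "master",  "Failed",  4),
]
conn.executemany("INSERT INTO audit_log VALUES (?, ?, ?, ?)", events)

# Current status per pipeline = the row with the highest sequence number.
rows = conn.execute("""
    SELECT pipeline, status FROM audit_log a
    WHERE seq = (SELECT MAX(seq) FROM audit_log b
                 WHERE b.run_id = a.run_id AND b.pipeline = a.pipeline)
    ORDER BY pipeline
""").fetchall()
print(rows)  # latest status row per pipeline
```

The trade-off is that the table grows with every status change, so periodic compaction (or a view that exposes only the latest row per run) is usually paired with this pattern.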

Anonymous
Not applicable

Hi @SooryaP 

Have you resolved this issue? If any of the answers provided were helpful, please consider accepting them as a solution. If you have found other solutions, we would greatly appreciate it if you could share them with us. 

 

Best Regards,
Jing
