I have a pipeline that runs 4 notebook tasks using the high concurrency feature for running multiple notebooks. I set the task timeout to "0.02:00:00" (2 hours).
However, since August 1, 2025, I have encountered around 4 to 5 cases where the Spark session gets stuck in a running state (e.g., for 39 hours) and is not force-stopped after the 2-hour timeout as expected.
Has anyone experienced the same issue, or is there a known workaround/fix?
Thank you in advance for your support and guidance!
@nguyenhieubis : It looks like a known issue in Fabric, especially with the high concurrency runtime in Spark. The task timeout is enforced only at the orchestration layer, where the pipeline waits for the notebook to complete; it does not forcefully terminate the Spark session. You can try setting an explicit timeout inside the notebook itself.
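A minimal sketch of that idea, assuming a PySpark notebook where the session object is available as spark; the watchdog approach and the 2-hour value are illustrative, not an official Fabric API:

```python
import threading

# Illustrative in-notebook watchdog: if the notebook is still running after
# max_runtime_seconds, stop the Spark session so the task fails fast instead
# of hanging. The 2-hour value mirrors the pipeline task timeout.
max_runtime_seconds = 2 * 60 * 60

def _stop_stuck_session():
    # spark is the SparkSession that Fabric notebooks expose by default.
    print(f"Watchdog: exceeded {max_runtime_seconds}s, stopping Spark session.")
    spark.stop()

watchdog = threading.Timer(max_runtime_seconds, _stop_stuck_session)
watchdog.daemon = True  # do not keep the driver alive just for the timer
watchdog.start()

# ... notebook workload runs here ...

# Cancel the watchdog once the work finishes normally.
watchdog.cancel()
```

Depending on the runtime, a session utility such as mssparkutils.session.stop() may be a cleaner way to end the session; please check the Fabric documentation for the call supported in your environment.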
Thanks!
Hi @nguyenhieubis , hope you are doing great. May we know if your issue is solved or if you are still experiencing difficulties? Please share the details, as it will help the community, especially others with similar issues.
Hi @v-hashadapu , I'm still monitoring it. I don't have the issue anymore for now, but I'm not sure whether it will come back, so it's still worth keeping an eye on. I know Fabric is still being developed and improved, so I hope this issue gets fixed to make it more stable.
Hi @nguyenhieubis , thanks for sharing the information here. I hope the issue is cleared permanently for you. If you have any other queries, please feel free to raise a new post in the community. We are always happy to help.
Hi @nguyenhieubis , hope you are doing well. Just wanted to check on your issue to see if it came back or if everything is going well. Please let us know.
Thank you.