Kar_c
New Contributor

Concurrent Spark sessions

Hi, we have a Fabric capacity with an F8 SKU (medium, memory optimised). We frequently hit a "too many sessions" issue when using Spark notebooks or running pipelines. Also, when two pipelines run together, both take very long to complete. How many concurrent Spark sessions can we have with an F8 SKU?

What can we do to handle this scenario?

 

 

 

1 ACCEPTED SOLUTION
nilendraFabric
Honored Contributor

Hello @Kar_c 


• F8 SKU provides:
  • 16 Spark vCores (base).
  • Burst multiplier: 3x → up to 48 Spark vCores.
• Each Medium Starter Pool session (default) uses 16 vCores (2 nodes × 8 vCores/node).
• Without bursting: only 1 session can run (16 vCores / 16 vCores per session).
• With bursting: up to 3 sessions (48 vCores / 16 vCores per session).

 


If capacity is exceeded:
• Interactive notebooks: fail with "Too Many Requests For Capacity" errors.
• Scheduled jobs (via pipelines/scheduler): queue, up to 8 jobs.

 

Each pipeline activity starts a separate session. For example:
• 2 pipelines running 2 notebooks each = 4 sessions (exceeds F8's 3-session limit with bursting).
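The arithmetic above can be sketched as a quick capacity check. This is only an illustration using the figures quoted in this post (16 base Spark vCores, 3x burst, 16 vCores per default Medium pool session); actual limits depend on your capacity settings.

```python
# Quick back-of-envelope check for concurrent Spark sessions on an F8 SKU.
# Figures taken from the post above; adjust for your own SKU and pool size.

def max_concurrent_sessions(base_vcores, burst_multiplier, vcores_per_session):
    """Number of sessions that fit in the burstable vCore budget."""
    return (base_vcores * burst_multiplier) // vcores_per_session

# Default Medium Starter Pool (2 nodes x 8 vCores = 16 vCores/session):
print(max_concurrent_sessions(16, 3, 16))  # -> 3

# Single-node pool (1 node x 8 vCores = 8 vCores/session):
print(max_concurrent_sessions(16, 3, 8))   # -> 6
```

This makes it easy to see why 4 simultaneous notebook sessions from two pipelines exceed the 3-session budget of the default configuration.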

 

 

Try these approaches for better concurrency:

 

Use High Concurrency Mode
• Allows up to 5 notebooks to share a single Spark session.

 

Leverage `NotebookUtils.RunMultiple`
• Run multiple notebooks in parallel within a single session to avoid capacity limits.
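A minimal sketch of that pattern, with placeholder notebook names ("Ingest_A", "Ingest_B"). Note that `notebookutils` is injected by the Fabric Spark runtime and is not a regular pip package, so the call is guarded here and only does real work inside a Fabric notebook:

```python
# Run several child notebooks in parallel inside the CURRENT Spark session,
# so they share one session slot instead of each consuming their own.
# "Ingest_A" / "Ingest_B" are placeholder notebook names, not real notebooks.

def run_children_in_parallel(notebook_names):
    try:
        import notebookutils  # provided by the Fabric runtime only
        # runMultiple executes the listed notebooks concurrently in this session.
        return notebookutils.notebook.runMultiple(notebook_names)
    except Exception:
        return None  # outside a Fabric notebook session this is unavailable

result = run_children_in_parallel(["Ingest_A", "Ingest_B"])
```

Combined with High Concurrency Mode, this keeps fan-out work inside one session rather than multiplying session counts per pipeline activity.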

Single-node Spark Pool:
• Configure autoscale to 1 node → 8 vCores per session.
• F8 can run 6 concurrent sessions (48 vCores / 8 vCores per session).
Custom Small Pool:
• Half the size of Medium → 4 vCores per session.

 

 

 


v-saisrao-msft
Honored Contributor II

Hi @Kar_c,

Thank you for reaching out to the Microsoft Fabric Forum Community.

 

Thank you for your question. I wanted to check if you had a chance to review the information provided by @nilendraFabric on optimizing Spark concurrency. I am also including Microsoft documentation that may help you understand this better.

Concurrency limits and queueing in Apache Spark for Fabric - Microsoft Fabric | Microsoft Learn

 

If this post helps, please give us 'Kudos' and consider accepting it as a solution to help others find it more easily.

 

Thank you

v-saisrao-msft
Honored Contributor II

Hi @Kar_c,


I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

v-saisrao-msft
Honored Contributor II

Hi @Kar_c,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.

 

Thank you.

v-saisrao-msft
Honored Contributor II

Hi @Kar_c,

 

We haven't heard back from you regarding your issue. If it has been resolved, please mark the helpful response as the solution and give a 'Kudos' to assist others. If you still need support, let us know.

 

Thank you.
