Bingi Nagesh
Aug 20, 2022


Hi Tharun,

Thank you for the explanation. I have a question about spark.dynamicAllocation.maxExecutors: does this config apply to the entire Spark cluster, or only to the Spark step being run? I am running Spark jobs on AWS EMR with a step concurrency of 6. Each step requires 10 executors, with 4 cores per executor, so at a concurrency of 6 I need 240 executor cores (6 * 10 * 4). If I set spark.dynamicAllocation.maxExecutors=12, does YARN spin up 12 executors for each step, or 12 executors shared across all 6 concurrent steps?
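For reference, here is a minimal sketch of how the settings described above might be passed for a single step. The app name and the choice to set configs on the SparkSession builder are assumptions; the numbers (10 executors, 4 cores each, maxExecutors=12) come from the question.

```python
# Minimal PySpark sketch for one EMR step, using the numbers from the question.
# The app name is hypothetical; configs here are set per SparkSession.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("example-step")  # hypothetical name for one of the 6 concurrent steps
    .config("spark.executor.cores", "4")                       # 4 cores per executor
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.initialExecutors", "10")  # 10 executors per step
    .config("spark.dynamicAllocation.maxExecutors", "12")      # cap in question
    .getOrCreate()
)
```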

Thanks in advance.
