enaggar opened this issue 6 years ago
Thanks for raising this issue. I will check and get back to you shortly. Are you using dynamic allocation / autoscaling of executors?
Getting the same error when running in a Databricks notebook.
EfficiencyStatisticsAnalyzer and StageSkewAnalyzer both throw this error in a Jupyter notebook, and it seems to have the same cause: in AppContext.getMaxConcurrent, maxConcurrent never gets higher than 0 in those cases. We don't use dynamic allocation of executors.
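For anyone reproducing this, here is a minimal sketch of the failure mode (not the actual sparklens source; variable names and the computation are hypothetical): if getMaxConcurrent returns 0, any integer division by it in the analyzers throws java.lang.ArithmeticException: / by zero.

```scala
object MaxConcurrentRepro {
  def main(args: Array[String]): Unit = {
    // What we observe in notebook runs: max concurrency never rises above 0
    val maxConcurrent: Long = 0L
    val totalTaskTime: Long = 123456L
    // Hypothetical efficiency-style computation, for illustration only:
    // integer/long division by zero throws ArithmeticException in Scala
    val idealWallClock = totalTaskTime / maxConcurrent
    println(idealWallClock)
  }
}
```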
@iamrohit Thanks for fixing this ;)
I'm receiving this error when running sparklens on a Spark history file.
In the output I can see total number of cores available = 10 and total number of executors = 11. What could be the cause of this? It makes the executorCores variable equal to zero (integer division of 10 by 11), which leads to the issue above.
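The arithmetic from that output, as a sketch (hypothetical variable names, assuming cores per executor is derived by dividing total cores by executor count, which is not confirmed from the sparklens source):

```scala
val totalCoresAvailable = 10
val totalExecutors = 11
// Integer division truncates toward zero: 10 / 11 == 0,
// so any later division by executorCores crashes.
val executorCores = totalCoresAvailable / totalExecutors
// One possible defensive guard to avoid the crash:
val safeExecutorCores = math.max(1, executorCores)
println(s"executorCores=$executorCores, safeExecutorCores=$safeExecutorCores")
```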