and Spark configurations are set in `/usr/lib/spark/conf/spark-defaults.conf`, where `spark.executor.memory=19650M`, `spark.executor.cores=5`, and `spark.executor.memoryOverhead=2184`.
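As I understand YARN sizing, one container per executor should hold the executor heap plus the off-heap overhead, so a quick sanity check on the numbers above (a sketch; exact container sizes also depend on the scheduler's request rounding, which I have not verified):

```python
# Rough sanity check: a YARN container for one executor holds the
# executor heap plus the off-heap overhead.
# Values taken from spark-defaults.conf above.
executor_memory_mb = 19650   # spark.executor.memory
memory_overhead_mb = 2184    # spark.executor.memoryOverhead

container_mb = executor_memory_mb + memory_overhead_mb
print(container_mb)  # 21834
```

That comes out to 21834 MB, close to the 21856 MB the YARN UI reports below.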
In another case, I tried to set the cores per executor at run time by passing `--executor-cores` to `spark-submit`...
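For reference, the invocation looked roughly like this (a sketch; `com.example.MyApp` and `my-app.jar` are placeholders, not my actual job):

```shell
# Sketch only: com.example.MyApp and my-app.jar are placeholders.
# --executor-cores sets the cores per executor and overrides
# spark.executor.cores from spark-defaults.conf.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --executor-cores 5 \
  --executor-memory 19650M \
  --conf spark.executor.memoryOverhead=2184 \
  --class com.example.MyApp \
  my-app.jar
```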
I have installed spark-sql-perf using:
Yet, in the YARN UI, I see this:
```
Container State: COMPLETE
Mon Jun 21 06:12:54 +0000 2021
Elapsed Time: 7mins, 16sec
Resource: 21856 Memory, 1 VCores
```

And there are 5 executors on each node, when there are 32 vCores.
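One way to reason about the executor count (a sketch; the per-node memory available to YARN is an assumed figure, not one from my cluster): with 32 vCores and 5 cores per executor, core-based packing alone would allow 6 executors per node, so seeing only 5 suggests memory, not cores, may be the binding constraint.

```python
# Executors per node = min(core-based limit, memory-based limit).
# yarn_node_memory_mb is an assumed example value, not a measured figure.
container_mb = 21856          # per-container size from the YARN UI
executor_cores = 5            # spark.executor.cores
node_vcores = 32

by_cores = node_vcores // executor_cores          # 32 // 5 = 6
yarn_node_memory_mb = 120_000                     # assumption for illustration
by_memory = yarn_node_memory_mb // container_mb   # 120000 // 21856 = 5

executors_per_node = min(by_cores, by_memory)
print(executors_per_node)  # 5 with these assumed numbers
```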