Apache Spark enhanced with native Kubernetes scheduler back-end. NOTE: this repository is being ARCHIVED, as all new development for the Kubernetes scheduler back-end now happens at https://github.com/apache/spark/
@coderanger, it would be great if you could help rebase this entire fork on top of the upstream Spark effort. Then we'd be in a better position to use this PR, since the Dockerfiles etc. are now very different.
What changes were proposed in this pull request?

Adds a

source "${SPARK_HOME}/bin/load-spark-env.sh"

to the command in each non-spark-class container. This allows setting things like HADOOP_CONF_DIR in the more traditional Spark way.

How was this patch tested?
Manual testing with my local development environment.
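To illustrate the effect of the change: sourcing load-spark-env.sh makes the variables from conf/spark-env.sh visible to the container command that follows. The sketch below is a hypothetical stand-in, not the actual image entrypoint; it fabricates a temporary SPARK_HOME with a spark-env.sh that exports HADOOP_CONF_DIR, then sources it the way load-spark-env.sh would.

```shell
#!/usr/bin/env bash
# Hypothetical illustration of what sourcing the Spark env script achieves.
# Build a throwaway SPARK_HOME containing a conf/spark-env.sh.
SPARK_HOME="$(mktemp -d)"
mkdir -p "${SPARK_HOME}/conf"
cat > "${SPARK_HOME}/conf/spark-env.sh" <<'EOF'
export HADOOP_CONF_DIR=/etc/hadoop/conf
EOF

# In the real container command, this line is:
#   source "${SPARK_HOME}/bin/load-spark-env.sh"
# which (among other things) sources conf/spark-env.sh if present.
source "${SPARK_HOME}/conf/spark-env.sh"

# The variable is now available to whatever the container runs next.
echo "HADOOP_CONF_DIR=${HADOOP_CONF_DIR}"
```

Without the sourced script, HADOOP_CONF_DIR would have to be injected some other way (e.g. hard-coded in the Dockerfile), which is what the patch avoids.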