Closed: tmckayus closed this pull request 7 years ago.
@sahilprasad @mccheah @foxish ptal
LGTM.
Thank you for this! @ssuchter thoughts?
Link to some back-story for reference: https://github.com/apache-spark-on-k8s/spark/pull/464#issuecomment-341739680
@erikerlandson @tmckayus Looks great! Good to have this resolved in a cleaner way than my PR (btw, feel free to close).
LGTM, merging EOD unless someone has objections.
Would it be possible to push the built image to Docker Hub as well? I can see that the current image doesn't contain this fix.
Quotes around $SPARK_CLASSPATH in the java invocations in the Dockerfiles prevent the shell from expanding wildcard paths when the classpath is a single glob value like /opt/spark/jars/*. Without the quotes, the shell expands the glob into many jar paths, so only the first one becomes the -cp value and the next is treated as the main class; with the quotes, java receives the literal glob and expands it itself.
What changes were proposed in this pull request?
Each place where $SPARK_CLASSPATH is used as the value of the -cp flag to ${JAVA_HOME}/bin/java in the Dockerfiles under resource-managers/kubernetes/docker-minimal-bundle/src/main/docker has been changed to wrap $SPARK_CLASSPATH in double quotes.
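In essence the change looks like the following sketch; the exact entrypoint lines in the Dockerfiles include more options, and the class variable name here is a placeholder:

```dockerfile
# Before: the shell running CMD expands /opt/spark/jars/* into many jar paths,
# so only the first one ends up as the -cp value and the next is read as the main class.
CMD ${JAVA_HOME}/bin/java -cp $SPARK_CLASSPATH $SPARK_DRIVER_CLASS

# After: the quoted value is passed to java literally, and java expands the
# wildcard itself when building the classpath.
CMD ${JAVA_HOME}/bin/java -cp "$SPARK_CLASSPATH" $SPARK_DRIVER_CLASS
```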
How was this patch tested?
Manually. Without the change, the standard instructions for running a Python example fail when the unnecessary --jars flag is left off; with the change, the example runs correctly. The command used:
bin/spark-submit \
--deploy-mode cluster \
--master k8s://https://<api-server-host>:<api-server-port> \
--kubernetes-namespace <namespace> \
--conf spark.executor.instances=5 \
--conf spark.app.name=spark-pi \
--conf spark.kubernetes.driver.docker.image=kubespark/spark-driver-py:v2.2.0-kubernetes-0.5.0 \
--conf spark.kubernetes.executor.docker.image=kubespark/spark-executor-py:v2.2.0-kubernetes-0.5.0 \
local:///opt/spark/examples/src/main/python/pi.py 10
Error from the Spark driver (it could be any jar, depending on the order of expansion by the shell):
Error: Could not find or load main class .opt.spark.jars.RoaringBitmap-0.5.11.jar
The other images were not tested explicitly, but the same fix applies to them.
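For reference, the failure mode above is ordinary shell globbing; a small standalone illustration with made-up paths (not project code):

```sh
# Create two dummy jars and set the classpath to a glob, as the Dockerfiles do.
mkdir -p /tmp/demo/jars && touch /tmp/demo/jars/a.jar /tmp/demo/jars/b.jar
SPARK_CLASSPATH='/tmp/demo/jars/*'

# Unquoted: the shell expands the glob, so java would see two arguments after -cp
# and treat the second jar path as the main class name.
echo java -cp $SPARK_CLASSPATH
#   java -cp /tmp/demo/jars/a.jar /tmp/demo/jars/b.jar

# Quoted: java receives the literal glob and expands it itself into the classpath.
echo java -cp "$SPARK_CLASSPATH"
#   java -cp /tmp/demo/jars/*
```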