apache-spark-on-k8s / spark

Apache Spark enhanced with native Kubernetes scheduler back-end: NOTE this repository is being ARCHIVED as all new development for the kubernetes scheduler back-end is now on https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

Fail to launch spark-pi #590

Closed zjffdu closed 6 years ago

zjffdu commented 6 years ago

I use the following command to launch the SparkPi example, but encounter a ClassNotFoundException. Am I missing anything? Thanks

```sh
bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://192.168.99.100:8443 \
  --kubernetes-namespace default \
  --conf spark.executor.instances=5 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.5.0 \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.5.0 \
  local:///Users/jzhang/github/spark-k8s/examples/target/scala-2.11/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar
```

```
Error: Could not find or load main class org.apache.spark.examples.SparkPi
```
ifilonenko commented 6 years ago

The jar at your `local://` path isn't the one baked into the Docker image. If you want to use a remote jar, use the RSS (resource staging server). tl;dr: it can't find the jar at that `local://` path inside the Docker image.
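To make the distinction concrete: a `local://` URI is resolved against the filesystem inside the driver/executor container, not the machine running `spark-submit`. The sketch below is purely illustrative (it is not Spark's actual resolution code) and uses Python's `urllib.parse` to show how the scheme changes where the path is looked up:

```python
from urllib.parse import urlparse

def describe_jar_uri(uri: str) -> str:
    """Illustrative sketch only, not Spark's real code: local:// names a
    path inside the container image, file:// (or a bare path) names a
    file on the submitting client, anything else is a remote source."""
    parsed = urlparse(uri)
    if parsed.scheme == "local":
        return f"inside container image: {parsed.path}"
    if parsed.scheme in ("file", ""):
        return f"on client host: {parsed.path}"
    return f"remote ({parsed.scheme}): {uri}"

# A client-host path passed as local:// will not exist in the image:
print(describe_jar_uri("local:///Users/jzhang/github/spark-k8s/app.jar"))
print(describe_jar_uri("file:///Users/jzhang/github/spark-k8s/app.jar"))
```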

zjffdu commented 6 years ago

Oops, so the `local://` path refers to a file inside the Docker image; I thought it referred to the client host. The jar inside the image is at `local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar`.
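Putting that together, a corrected submission would point `spark-submit` at the jar path that exists inside the kubespark images (this is a sketch assuming the same cluster address, namespace, and image tags from the original command; it isn't runnable outside that cluster):

```sh
bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://192.168.99.100:8443 \
  --kubernetes-namespace default \
  --conf spark.executor.instances=5 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:v2.2.0-kubernetes-0.5.0 \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:v2.2.0-kubernetes-0.5.0 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar
```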