apache-spark-on-k8s / spark

Apache Spark enhanced with a native Kubernetes scheduler back-end. NOTE: this repository is being ARCHIVED, as all new development for the Kubernetes scheduler back-end now happens on https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

java.lang.ClassNotFoundException: org.apache.spark.deploy.kubernetes.submit.Client #411

Open fogongzi opened 7 years ago

fogongzi commented 7 years ago

When I run the Spark examples on Kubernetes, the submission fails with the following error:

java.lang.ClassNotFoundException: org.apache.spark.deploy.kubernetes.submit.Client
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:723)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:188)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:213)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:127)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

and my run script is as follows:

./bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://10.71.156.239:6443 \
  --kubernetes-namespace spark-cluster \
  --conf spark.executor.instances=5 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=10.71.156.242/spark/spark-driver:latest \
  --conf spark.kubernetes.executor.docker.image=10.71.156.242/spark/spark-executor:latest \
  --conf spark.kubernetes.initcontainer.docker.image=10.71.156.242/spark/spark-init:latest \
  local:///workspace/scala/spark-2.1.0-kubernetes-0.3.0/dist/examples/jars/spark-examples_2.11-2.1.0-k8s-0.3.0-SNAPSHOT.jar

So what is the problem? Thanks!
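
As a quick sanity check (a minimal sketch, assuming the dist path used in the spark-submit command above, and that the spark-kubernetes jar name below matches the build), one can confirm whether the submission client class from the stack trace is actually packaged in the distribution that ./bin/spark-submit is run from:

    # Run from the distribution directory used above (path assumed from the command).
    cd /workspace/scala/spark-2.1.0-kubernetes-0.3.0/dist

    # List any Kubernetes-related jars packaged with this distribution.
    ls jars/ | grep -i kubernetes

    # Check a candidate jar for the class reported in the stack trace
    # (the exact jar file name is an assumption and may differ per build).
    jar tf jars/spark-kubernetes_2.11-2.1.0-k8s-0.3.0-SNAPSHOT.jar \
      | grep 'org/apache/spark/deploy/kubernetes/submit/Client'

If the grep finds nothing, the ClassNotFoundException above would be consistent with spark-submit loading a distribution that was built without the Kubernetes scheduler back-end.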