apache-spark-on-k8s / spark

Apache Spark enhanced with a native Kubernetes scheduler back-end. NOTE: this repository is being ARCHIVED, as all new development for the Kubernetes scheduler back-end now happens at https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

Could not find or load main class SparkPi #632


iniyanp commented 6 years ago

Hi,

I was trying to run the SparkPi application on k8s. When I submit the job with spark-submit,

$ spark-2.3.1-bin-hadoop2.7/bin/spark-submit \
    --master k8s://<k8s-url> \
    --deploy-mode cluster \
    --name spark-pi \
    --class SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///vagrant/softwares/sparkpi_2.11-1.0.jar
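The jar was built from a SparkPi object along these lines (a sketch of the shape, not the exact source; the session/app names are assumptions). For `--class SparkPi` to resolve, the object must sit at the top level with no `package` declaration; if it were declared inside a package, spark-submit would need the fully qualified name, e.g. `--class com.example.SparkPi`.

```scala
// Sketch of a jar layout matching "--class SparkPi": a top-level object
// (no package declaration) with a standard main entry point.
import org.apache.spark.sql.SparkSession

object SparkPi {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("spark-pi").getOrCreate()
    val n = 100000
    // Monte Carlo estimate of pi: count random points inside the unit circle.
    val count = spark.sparkContext
      .parallelize(1 to n)
      .filter { _ =>
        val x = math.random * 2 - 1
        val y = math.random * 2 - 1
        x * x + y * y <= 1
      }
      .count()
    println(s"Pi is roughly ${4.0 * count / n}")
    spark.stop()
  }
}
```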

I am getting the following exception.

exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java -Dspark.kubernetes.executor.podNamePrefix=spark-pi-d9a15eabd9a9360b863577bed1f247a4 -Dspark.app.id=spark-f0dc3926565f4cd59a48459b367a6460 -Dspark.master=k8s://<k8s-url> -Dspark.app.name=spark-pi -Dspark.kubernetes.container.image=spark-image -Dspark.driver.host=spark-pi-d9a15eabd9a9360b863577bed1f247a4-driver-svc.default.svc -Dspark.executor.instances=5 -Dspark.submit.deployMode=cluster -Dspark.driver.blockManager.port=7079 -Dspark.kubernetes.driver.pod.name=spark-pi-d9a15eabd9a9360b863577bed1f247a4-driver -Dspark.jars=<spark-pi-jar> -Dspark.driver.port=7078 -cp ':/opt/spark/jars/*:/vagrant/softwares/sparkpi_2.11-1.0.jar:/vagrant/softwares/sparkpi_2.11-1.0.jar' -Xms1g -Xmx1g -Dspark.driver.bindAddress=<ip> SparkPi

Error: Could not find or load main class SparkPi

Could anyone help me?

Thanks