ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: External scheduler cannot be instantiated
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:3043)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:568)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Unknown Source)
at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Unknown Source)
at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.makeExecutorPodsAllocator(KubernetesClusterManager.scala:185)
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:139)
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:3037)
... 15 more
I think the main problem is the "ERROR SparkContext: Error initializing SparkContext." line here.
Has anyone run into a related error and managed to solve it?
I don't know what I did wrong!!
P.S.
I already changed the serviceaccount to sparkoperator-spark-operator and then to my own serviceaccount, so I don't think this is an RBAC problem (a quick permission check is sketched below).
I have also already tried switching to the official Spark image.
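To rule RBAC out more directly, the driver serviceaccount's permissions can be checked with `kubectl auth can-i`. This is only a sketch: the namespace `default` and the serviceaccount name below are assumptions, so substitute the values from your own SparkApplication.

```shell
# Check whether the driver's serviceaccount may manage executor pods and related resources.
# Namespace "default" and the serviceaccount name are assumptions - replace with your values.
kubectl auth can-i create pods       --as=system:serviceaccount:default:sparkoperator-spark -n default
kubectl auth can-i delete pods       --as=system:serviceaccount:default:sparkoperator-spark -n default
kubectl auth can-i create services   --as=system:serviceaccount:default:sparkoperator-spark -n default
kubectl auth can-i create configmaps --as=system:serviceaccount:default:sparkoperator-spark -n default
```

If all of these return `yes`, RBAC is probably not the cause.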
Hi there. I'm running into a serious problem.
I installed the spark-operator with the following commands:
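(The exact commands are not pasted above; roughly, a typical Helm install of the operator looks like the sketch below. The chart repo URL, release name, and namespace are assumptions, not the exact values used.)

```shell
# Sketch of a typical spark-operator install via Helm.
# Repo URL, release name "sparkoperator", and namespace are assumptions.
helm repo add spark-operator https://kubeflow.github.io/spark-operator
helm repo update
helm install sparkoperator spark-operator/spark-operator \
  --namespace spark-operator --create-namespace
```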
And I built my own Docker image from the files that come with the Spark download, using the following command:
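(Again only a sketch, since the original command is not shown; the registry, tag, and Spark version below are placeholders. The Spark distribution ships a `docker-image-tool.sh` helper for exactly this.)

```shell
# Build and push a PySpark-capable image from the unpacked Spark distribution.
# <my-registry>, <tag>, and the Spark directory name are placeholders.
cd spark-3.x.x-bin-hadoop3
./bin/docker-image-tool.sh -r <my-registry> -t <tag> \
  -p ./kubernetes/dockerfiles/spark/bindings/python/Dockerfile build
./bin/docker-image-tool.sh -r <my-registry> -t <tag> push
```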
Then I applied the YAML file to the spark operator, using the example provided at kubeflow/examples/spark-py-pi; the only change I made to the YAML is the Docker image.
I used the serviceaccount sparkoperator-spark that was created when the spark-operator was installed by Helm.
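For completeness, this is roughly how the application is submitted and inspected; the manifest file name and application name are placeholders based on the spark-py-pi example.

```shell
# Submit the SparkApplication and check its status.
# File name and application name are placeholders taken from the spark-py-pi example.
kubectl apply -f spark-py-pi.yaml
kubectl get sparkapplications
kubectl describe sparkapplication pyspark-pi
# The driver pod (usually named <app-name>-driver) holds the full error log:
kubectl logs pyspark-pi-driver
```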
The YAML file I'm using:
But an error occurred. The error log is shown at the top of this issue.
Please, can anyone suggest a solution?