kubeflow / spark-operator

Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.
Apache License 2.0

Running different versions of Spark jobs on a spark-operator cluster #1163

Closed vvavepacket closed 1 week ago

vvavepacket commented 3 years ago

Is it possible to run multiple Spark jobs/Spark applications, each with its own version (i.e. SparkApplication 1 uses Spark 2.4 and SparkApplication 2 uses Spark 3.0.0), in a single namespace?

So I will be creating different SparkApplications and submitting them to Kubernetes. Will this cause any issues, or is it totally fine?

jkleckner commented 3 years ago

Yes, just use the appropriate image for each application. Personally, I build my own Spark 2.4.x image to pick up fixes that haven't been released yet, such as https://github.com/apache/spark/pull/30283
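A sketch of what this looks like in practice, assuming the `v1beta2` SparkApplication API and placeholder image/registry names: two manifests in the same namespace, each pinning its own Spark image and `sparkVersion`. The operator submits each job with the image specified in its spec, so the versions don't interfere.

```yaml
# Two SparkApplications in the same namespace, each on its own Spark version.
# Image names, registry, and service account are placeholders -- adjust for your cluster.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pi-spark-2
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: my-registry/spark:2.4.8       # custom-built Spark 2.4.x image
  sparkVersion: "2.4.8"
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.11-2.4.8.jar
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark
  executor:
    instances: 1
    cores: 1
    memory: 512m
---
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pi-spark-3
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: my-registry/spark:3.0.0       # stock Spark 3.x image
  sparkVersion: "3.0.0"
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark
  executor:
    instances: 1
    cores: 1
    memory: 512m
```

Note the Scala version baked into the example jar names differs between the two (2.11 for Spark 2.4, 2.12 for Spark 3.0), which is why each application must reference artifacts built against its own Spark distribution.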

github-actions[bot] commented 4 weeks ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

github-actions[bot] commented 1 week ago

This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.