apache-spark-on-k8s / spark

Apache Spark enhanced with a native Kubernetes scheduler back-end. NOTE: this repository is being ARCHIVED, as all new development for the Kubernetes scheduler back-end now happens in https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

How to kill a running spark job? #436

Closed · leletan closed this issue 7 years ago

leletan commented 7 years ago

On a standalone cluster we used to kill a running Spark job by its submission ID, but I'm not sure whether that is supported here. Is there any workaround to achieve the same thing?
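For context, this is a sketch of the standalone-cluster-mode kill the question refers to, using `spark-submit --kill` against the standalone master's REST submission endpoint. The master URL and submission ID below are placeholders, not values from this thread:

```shell
# Standalone cluster mode: kill a driver by its submission ID
# (master host and submission ID are hypothetical placeholders)
./bin/spark-submit \
  --master spark://master-host:6066 \
  --kill driver-20170101000000-0000
```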

mccheah commented 7 years ago

Use kubectl to delete the driver pod and that should also delete all of the objects the application depends on.
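Concretely, that looks something like the following; the namespace and driver pod name are placeholders for whatever your submission generated:

```shell
# Find the driver pod for the running application
kubectl get pods -n <namespace>

# Deleting the driver pod also tears down the objects the
# application depends on (executor pods, etc.), per the comment above
kubectl delete pod <driver-pod-name> -n <namespace>
```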

leletan commented 7 years ago

Thanks.

mccheah commented 7 years ago

In the future, when we have a Spark CustomResource, we should theoretically be able to use the Kubernetes dashboard to destroy Spark applications.
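For illustration only: with such a custom resource, teardown would become a single deletion of the resource rather than of the driver pod, with Kubernetes garbage collection removing the objects it owns. The resource kind and name below are hypothetical, not an API this repository ships:

```shell
# Hypothetical: delete a Spark application custom resource;
# owned pods and services would be garbage-collected by Kubernetes
kubectl delete sparkapplication my-spark-app -n <namespace>
```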