apache-spark-on-k8s / spark

Apache Spark enhanced with a native Kubernetes scheduler back-end. NOTE: this repository is being ARCHIVED, as all new development for the Kubernetes scheduler back-end now happens at https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

User-specified node selector on pods #358

Closed — ash211 closed this issue 7 years ago

ash211 commented 7 years ago

I'd like to be able to add a Spark configuration property that applies a node selector to all pods in a Spark job. That would let me select every pod belonging to a given job, e.g. for per-job resource tracking.
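For context, here is a sketch of what such a configuration could look like at submit time. The property form `spark.kubernetes.node.selector.[labelKey]` matches what upstream Apache Spark later documented for its Kubernetes back-end; the label key/value, image, and other arguments below are illustrative placeholders, not values from this issue.

```bash
# Illustrative spark-submit invocation (placeholder image, class, and label values).
# The idea: each spark.kubernetes.node.selector.<key>=<value> pair is copied into
# the nodeSelector of every driver and executor pod the job creates.
spark-submit \
  --master k8s://https://kubernetes.example.com:6443 \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=example/spark:latest \
  --conf spark.kubernetes.node.selector.job-id=my-spark-job \
  local:///opt/spark/examples/jars/spark-examples.jar
```

With a per-job label like `job-id` above, every pod the job creates can be listed (or metered) with a single label selector.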

mccheah commented 7 years ago

This continues to build on the need for custom pod specifications mentioned in #38. The feature set and room for customization are likely greater in Kubernetes than in the other cluster managers supported so far, so the original design, which aligns with those cluster managers, may be too limiting here.
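For reference, the Kubernetes-side effect being requested is just a `nodeSelector` stanza on the generated pods. A sketch of the relevant pod fragment (names and label values are placeholders):

```yaml
# Hypothetical fragment of a generated driver/executor pod spec.
# The scheduler will only place this pod on nodes carrying a matching label,
# e.g. one applied with: kubectl label nodes <node> job-id=my-spark-job
apiVersion: v1
kind: Pod
metadata:
  name: spark-driver-example
spec:
  nodeSelector:
    job-id: my-spark-job
  containers:
    - name: spark-driver
      image: example/spark:latest
```

A fully general custom pod specification, as discussed in #38, would go well beyond this one field, which is part of the design tension described above.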

mccheah commented 7 years ago

Oh, I think this is handled by #355.