apache-spark-on-k8s / spark

Apache Spark enhanced with a native Kubernetes scheduler back-end. NOTE: this repository is being ARCHIVED, as all new development for the Kubernetes scheduler back-end now happens on https://github.com/apache/spark/
https://spark.apache.org/
Apache License 2.0

Application names should support whitespaces and special characters #551

Open echarles opened 6 years ago

echarles commented 6 years ago

If I give --conf spark.app.name="spark pi" as a parameter (a name containing a whitespace), the exception below is thrown (Invalid value: "spark pi-1510599807833-driver": a DNS-1123 subdomain must consist of lower case alphanumeric characters). I guess some code could replace all whitespaces with a hyphen to make DNS happy. To make it more robust, a more advanced normalization (special characters, ...) should also be done.

Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: POST at: https://192.168.1.7:6443/api/v1/namespaces/default/pods. Message: Pod "spark pi-1510599807833-driver" is invalid: metadata.name: Invalid value: "spark pi-1510599807833-driver": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'). Received status: Status(apiVersion=v1, code=422, details=StatusDetails(causes=[StatusCause(field=metadata.name, message=Invalid value: "spark pi-1510599807833-driver": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'), reason=FieldValueInvalid, additionalProperties={})], group=null, kind=Pod, name=spark pi-1510599807833-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=Pod "spark pi-1510599807833-driver" is invalid: metadata.name: Invalid value: "spark pi-1510599807833-driver": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'), metadata=ListMeta(resourceVersion=null, selfLink=null, additionalProperties={}), reason=Invalid, status=Failure, additionalProperties={}).
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:470)
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.assertResponseCode(OperationSupport.java:409)
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:379)
    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:343)
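
A minimal sketch of the kind of normalization suggested above, lower-casing the name, replacing whitespace and other disallowed characters with hyphens, and trimming non-alphanumeric edges so the result is a valid DNS-1123 label. The normalizeAppName helper is hypothetical and not part of the Spark code base:

    object AppNameNormalizer {
      // Hypothetical helper: lower-case the name, replace any character
      // outside [a-z0-9.-] (including whitespace) with a hyphen, and trim
      // leading/trailing non-alphanumerics so the result can be used as a
      // DNS-1123 subdomain in Kubernetes resource names.
      def normalizeAppName(rawName: String): String = {
        val lowered = rawName.toLowerCase
        val replaced = lowered.replaceAll("[^a-z0-9.-]", "-")
        val trimmed = replaced.replaceAll("^[^a-z0-9]+|[^a-z0-9]+$", "")
        if (trimmed.isEmpty) "spark-app" else trimmed
      }
    }

With this, normalizeAppName("spark pi") would yield "spark-pi", so the generated pod name "spark-pi-1510599807833-driver" would pass the Kubernetes validation above.
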
ash211 commented 6 years ago

Agreed, we should probably do some stronger normalization to ensure that using app names as Kubernetes resource names doesn't cause characters to be used in places where they're invalid.
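
For reference, a standalone check against the validation regex quoted verbatim in the error message above (a sketch, not code from this repository; the 253-character cap is the Kubernetes subdomain length limit):

    object Dns1123Check {
      // Regex copied from the Kubernetes error message in this issue.
      private val Dns1123Subdomain =
        "[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*".r

      def isValidSubdomain(name: String): Boolean =
        name.length <= 253 && Dns1123Subdomain.pattern.matcher(name).matches()

      def main(args: Array[String]): Unit = {
        println(isValidSubdomain("spark pi-1510599807833-driver"))  // false
        println(isValidSubdomain("spark-pi-1510599807833-driver"))  // true
      }
    }
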