radanalyticsio / spark-operator

Operator for managing the Spark clusters on Kubernetes and OpenShift.
Apache License 2.0

serviceAccountName: spark-crd-operator is not used by SparkApplication #267

Open kevinyu98 opened 4 years ago

kevinyu98 commented 4 years ago

Creating the Spark Operator with OLM creates this service account: spark-crd-operator. If you then run a Spark application without specifying a service account, the operator falls back to the default one, spark-operator, which does not exist. As a result the application pod can't be created.

Description:

I deployed the operator through the OLM console, then tried to run the example from the OLM console.

(screenshot: Screen Shot 2019-12-06 at 8.20.34 AM)

After creating it, there is no pod created. From `oc get events`:

```
$ oc get events
LAST SEEN   FIRST SEEN   COUNT   NAME                                      KIND                    SUBOBJECT   TYPE      REASON         SOURCE                   MESSAGE
8m          2h           39      my-spark-app-submitter.15dda9bcc532da58   ReplicationController               Warning   FailedCreate   replication-controller   Error creating: pods "my-spark-app-submitter-" is forbidden: error looking up service account operators/spark-operator: serviceaccount "spark-operator" not found
```

Here are my service accounts:

```
$ oc get serviceaccount
NAME                 SECRETS   AGE
builder              2         16d
default              2         16d
deployer             2         16d
spark-crd-operator   2         9d
```

From the operator code, it seems that if the Spark application doesn't provide a service account, the operator uses the default one, which is spark-operator. Should we change manifest/olm/crd/sparkclusteroperator.1.0.1.clusterserviceversion.yaml to use spark-operator? The ConfigMap-based OLM manifest, manifest/olm/configmap-based-all-in-one-csv.yaml, is already using spark-operator.

Steps to reproduce:

  1. deploy the Spark operator through OLM
  2. create a SparkApplication without specifying the serviceAccount
  3. the Spark application pod will not come up
  4. check `kubectl`/`oc get events`

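Until the manifests agree on one name, a possible workaround is to set the service account explicitly on the application. This is an untested sketch: the `serviceAccountName` field and the `image`/`mainApplicationFile` values are assumptions about the CR schema, and the account name is the one OLM actually created per this report.

```yaml
# Hypothetical workaround: point the application at the service account
# that OLM actually created (spark-crd-operator) instead of relying on
# the operator's default of "spark-operator".
apiVersion: radanalytics.io/v1
kind: SparkApplication
metadata:
  name: my-spark-app
spec:
  serviceAccountName: spark-crd-operator   # assumed field name; verify against the CRD
  image: lightbend/spark:2.3.0-OpenShift-2.2.0-rh   # placeholder image
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples.jar
```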
elmiko commented 4 years ago

iirc the spark-crd-operator service account is only for the actual operator instance. the spark-operator service account is used for the actions of deploying pods in your user project.
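
if the spark-operator service account really is missing from the namespace where the application runs, one manual workaround (untested sketch; the `operators` namespace and account name are taken from the events output above) would be to create it yourself:

```shell
# untested sketch: create the service account the operator expects
# ("spark-operator") in the namespace where the SparkApplication runs
oc create serviceaccount spark-operator -n operators
# grant it enough rights to create pods in that namespace
oc policy add-role-to-user edit -z spark-operator -n operators
```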

i have not tried to reproduce this yet, but i will try to take a look. thanks for reporting it!

elmiko commented 4 years ago

looking at your screenshot, it appears you are trying to run the spark application in a namespace called operators. i would double check to make sure that is the proper namespace, and that you have sufficient privileges to create these resources there.