radanalyticsio / spark-operator

Operator for managing the Spark clusters on Kubernetes and OpenShift.
Apache License 2.0

Ability to have a specific configuration for a SparkApplication #298

Open wcallag3 opened 4 years ago

wcallag3 commented 4 years ago

Description:

Is there any way to provide a specific configuration for a specific SparkApplication? (I'm talking about configuration parameters that are passed with --conf when calling spark-submit from the command line.) Examples include:

- spark.sql.parquet.filterPushdown=true
- spark.sql.execution.arrow.enabled=true
- spark.default.parallelism=
- etc.

I see it's possible at the SparkCluster level (see here), but what about at the SparkApplication level?
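For reference, this is roughly what the cluster-level mechanism looks like: a name/value list under sparkConfiguration on the SparkCluster CR. A minimal sketch (field names and apiVersion follow the spark-operator's published SparkCluster examples; the specific values here are placeholders, not taken from this issue):

```yaml
# Sketch of cluster-level Spark properties on a SparkCluster CR.
# Names/values are illustrative placeholders.
apiVersion: radanalytics.io/v1
kind: SparkCluster
metadata:
  name: my-cluster
spec:
  worker:
    instances: "2"
  sparkConfiguration:
    - name: spark.executor.memory
      value: 1g
```

Whether an equivalent inline list exists on SparkApplication is exactly the open question here.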

I noticed there is a "sparkConfigMap" parameter at the SparkApplication level. Would that let me accomplish the above? If so, I haven't found much documentation on how to use it, and I would appreciate an example.
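If sparkConfigMap behaves like the cluster-level ConfigMap mechanism, one plausible (untested) usage would be to point it at a ConfigMap whose entries are Spark configuration files. Everything below is an assumption rather than confirmed operator behavior: the key name spark-defaults.conf, whether the operator mounts it into the driver's SPARK_CONF_DIR, and the mainClass/mainApplicationFile values are all hypothetical:

```yaml
# ASSUMPTION: sparkConfigMap references a ConfigMap in the same namespace
# whose entries are treated as Spark configuration files (e.g. spark-defaults.conf).
apiVersion: v1
kind: ConfigMap
metadata:
  name: my-app-spark-conf
data:
  spark-defaults.conf: |
    spark.sql.parquet.filterPushdown true
    spark.default.parallelism 8
---
apiVersion: radanalytics.io/v1
kind: SparkApplication
metadata:
  name: my-app
spec:
  mainClass: com.example.Main                    # hypothetical entry point
  mainApplicationFile: local:///opt/app/app.jar  # hypothetical jar path
  sparkConfigMap: my-app-spark-conf              # the parameter in question
```

If the operator does mount the ConfigMap this way, the properties would take effect without passing --conf explicitly; confirmation from the maintainers would still be needed.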