Is there any way to provide specific configuration for a specific SparkApplication? (I'm talking about configuration parameters that are passed in using --conf when calling spark-submit from the command line.) Examples include:
spark.sql.parquet.filterPushdown=true
spark.sql.execution.arrow.enabled=true
spark.default.parallelism=
etc.
I see it's possible at the SparkCluster level (see here), but what about the SparkApplication level?
I noticed there is a "sparkConfigMap" parameter at the SparkApplication level. Would this allow me to accomplish the above? I haven't found much information on how to use it, so if that is the right mechanism, I would appreciate an example.
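For context, here is a sketch of the kind of manifest I'm hoping to be able to write. I'm assuming the CRD exposes a spec.sparkConf map of string-to-string Spark properties (as the GoogleCloudPlatform spark-on-k8s-operator's SparkApplication does; other operators may differ), where each entry corresponds to one --conf key=value pair. All names, images, and values below are placeholders:

```yaml
# Hypothetical SparkApplication manifest -- a sketch only. Assumes the
# operator's CRD has a spec.sparkConf field; check your operator's docs.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-app                        # placeholder name
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: "my-spark-image:latest"      # placeholder image
  mainClass: com.example.Main         # placeholder main class
  mainApplicationFile: "local:///opt/app/app.jar"
  sparkConf:
    # Each entry here would correspond to one --conf key=value on spark-submit
    "spark.sql.parquet.filterPushdown": "true"
    "spark.sql.execution.arrow.enabled": "true"
    "spark.default.parallelism": "200"   # example value only
```

If sparkConfigMap instead mounts a ConfigMap of Spark configuration files (e.g. a spark-defaults.conf), an example of that pattern would be equally helpful.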