As a user, I want to pass Spark properties that are shared across multiple SparkApplications (e.g. Kerberos settings) to the spark-submit command via a ConfigMap. This would save me the effort and complexity of repeatedly defining the same Spark properties in every SparkApplication.

I tried some workarounds to approximate this behavior: mounting a spark-defaults.conf (which fails because the Spark driver already mounts a volume at /opt/spark/conf, so I cannot mount anything else at that location) and passing a --properties-file flag (which also fails because a --properties-file is already passed internally).
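For illustration, here is a rough sketch of what this could look like. Note that the `sparkPropertiesConfigMap` field is hypothetical, a proposal for how the feature might be exposed, and does not exist in the operator today; the ConfigMap names and Kerberos values are placeholders:

```yaml
# A ConfigMap holding Spark properties shared by many applications
apiVersion: v1
kind: ConfigMap
metadata:
  name: shared-spark-props
  namespace: spark-jobs
data:
  spark-defaults.conf: |
    spark.kerberos.keytab     /etc/security/keytabs/spark.keytab
    spark.kerberos.principal  spark/cluster@EXAMPLE.COM
---
# Hypothetical usage: the operator would merge these properties into the
# --properties-file it already passes to spark-submit, so they need not
# be repeated in every SparkApplication.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-app
  namespace: spark-jobs
spec:
  sparkPropertiesConfigMap: shared-spark-props  # proposed field, not real
```

Properties set explicitly in a SparkApplication's own `spec.sparkConf` would presumably take precedence over the shared ConfigMap, mirroring how spark-submit options override spark-defaults.conf.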