Currently you declare the arguments to a Hadoop DSL SparkJob using "appParams", where you list the job argument values directly.
Another way to do this would be to support "namedAppParams", which takes a list of keys for job properties you have already set on the job and looks up their corresponding values to use as the job arguments. This could throw an error if a given key has not been declared on the job.
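For illustration, a rough sketch of how the two styles might compare in a workflow definition (the application class, jar name, property keys, and the namedAppParams method itself are hypothetical; the rest assumes the usual sparkJob declaration form):

```groovy
hadoop {
  buildPath "azkaban"

  workflow('sparkWorkflow') {
    sparkJob('sparkJob1') {
      uses 'com.example.MySparkApp'    // hypothetical Spark application class
      executes 'my-spark-app.jar'      // hypothetical application jar

      // Job properties already declared on the job
      set properties: [
        'inputPath'  : '/data/input',
        'outputPath' : '/data/output'
      ]

      // Current style: list the argument values directly
      appParams ['/data/input', '/data/output']

      // Proposed style (does not exist yet): reference the property keys
      // declared above; the DSL would look up each key's value and pass it
      // as the job argument, and throw an error if a key is not declared.
      // namedAppParams ['inputPath', 'outputPath']
    }
  }
}
```

This would keep the argument values in one place (the job properties) instead of repeating them in the appParams list.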