Closed — samvantran closed this pull request 5 years ago
Unfortunately @akirillov, mapping `--jars <urls>` to `--conf spark.jars=<urls>` does not appear to work. It seems the extra jar was neither downloaded nor added to the class path. You can see this in a test run I tried (note that `Class path entries` is empty), and the job failed to find the class I specified.
I think `spark.jars.packages` works because Spark will attempt to resolve those packages from Maven Central repos, whereas `spark.jars` does not seem to work as advertised. Perhaps we'll need to stick with the workaround.
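For illustration, the flag-to-conf mapping being discussed can be sketched as a small translation step. This is a hypothetical helper, not the actual CLI code, and as noted above the resulting `spark.jars` conf did not take effect in practice:

```python
def jars_flag_to_conf(jar_urls):
    """Translate a --jars value into the equivalent --conf form.

    Hypothetical sketch of the mapping discussed above; in practice
    this mapping did not work because spark.jars was ignored
    (blacklisted) during conf creation.
    """
    return ["--conf", "spark.jars={}".format(",".join(jar_urls))]

# Example: two remote jars
args = jars_flag_to_conf([
    "http://example.com/a.jar",
    "http://example.com/b.jar",
])
# → ['--conf', 'spark.jars=http://example.com/a.jar,http://example.com/b.jar']
```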
Thanks @akirillov. Yes, this is for a bug fix; we can explore fixing the Spark source code in the future, since we discovered that `spark.jars` is blacklisted from conf creation.
What changes were proposed in this pull request?
Resolves DCOS-45376 and DCOS-33746.
This updates the Spark CLI to support the `--jars` flag and was inspired by this workaround.

Note: this also fixes a bug in the CLI where passing `--jars` would result in incorrect parsing of `submit-args` and the command would fail to submit. This should be backported to both 2.4.0 and 2.5.0.
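A minimal sketch of the kind of parsing fix described here, assuming the bug was that `--jars` and its value were not consumed together when splitting `submit-args` (hypothetical function and flag handling, not the actual CLI code):

```python
def parse_submit_args(tokens):
    """Split submit-args into jar URLs the CLI handles itself and
    passthrough arguments for spark-submit.

    Hypothetical sketch: --jars is consumed together with its value
    so the remaining tokens are not shifted and misparsed.
    """
    jars = []
    passthrough = []
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == "--jars" and i + 1 < len(tokens):
            # Consume the flag and its comma-separated value as a pair.
            jars = tokens[i + 1].split(",")
            i += 2
        else:
            passthrough.append(tok)
            i += 1
    return jars, passthrough

jars, rest = parse_submit_args(
    ["--jars", "http://example.com/dep.jar", "--class", "MyApp", "app.jar"]
)
# jars → ['http://example.com/dep.jar']
# rest → ['--class', 'MyApp', 'app.jar']
```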
How were these changes tested?
- Verified that `--conf` flags are added as part of `submit-args`
- Tested `--jars` with an app missing a Java class

Release Notes
- Added support for the `--jars` flag in the Spark CLI
- Fixed a bug where `--jars` would incorrectly parse and fail to submit