Is this a common thing to fight with? I'm having to be very specific about certain dependencies in my app at submission/runtime, and I've been working around it with shading plugins in my build tools. Some FQNs as an example:
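For context, the shading setup looks roughly like this — a minimal sketch assuming the Gradle Shadow plugin, with `com.google.guava` as a placeholder package, not the actual FQNs from my build:

```groovy
// build.gradle — sketch only; plugin version and relocated packages are assumptions
plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '2.0.1'
}

shadowJar {
    // Move a dependency that clashes with Spark's own bundled copy
    // into a shaded namespace so both versions can coexist at runtime.
    relocate 'com.google.guava', 'shaded.com.google.guava'
}
```

The relocation rewrites the bytecode references inside the fat jar, so the app loads its own copy while Spark keeps using the one on its classpath.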
Another thing to note is that a local cluster using spark-2.1.0-bin-hadoop2.7 doesn't really run into this.
Both environments are roughly executed in the same way:
./spark/bin/spark-submit --class com.some.App --master local[1] --verbose project/build/libs/all.jar
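One thing I've also tried (mentioning it in case it's relevant) is Spark's experimental class-loading flags instead of shading — a sketch of the same submit with `userClassPathFirst` enabled; behavior of these flags varies by Spark version:

```shell
# Ask Spark to prefer the app jar's classes over its own bundled copies.
# These are experimental flags and can surface different conflicts.
./spark/bin/spark-submit \
  --class com.some.App \
  --master local[1] \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --verbose \
  project/build/libs/all.jar
```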
Still going down this rabbit hole to get things just right - the amount of effort is substantial. Something feels wrong here.