[Closed] zartstrom closed this issue 2 years ago
So, if this is what I think it is: a pretty common issue with Spark, and not decline-specific. Briefly: Spark includes its own version of cats on the classpath, which gets used in preference to whatever decline pulls in. You could confirm this by checking which version of cats is included in the Spark distro, which version your build depends on, and whether the two are supposed to be binary compatible.
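One way to see the conflict directly is to ask the JVM which jar a cats class was actually loaded from. A minimal sketch (the object and helper names are made up for illustration; the class name is the one from the stack trace):

```scala
// Sketch: report which jar a class is loaded from at runtime.
object ClasspathCheck {
  def locationOf(className: String): Option[String] =
    for {
      clazz  <- scala.util.Try(Class.forName(className)).toOption
      source <- Option(clazz.getProtectionDomain.getCodeSource)
      loc    <- Option(source.getLocation)
    } yield loc.toString

  def main(args: Array[String]): Unit =
    // Run under spark-submit: if this points at a jar under $SPARK_HOME/jars,
    // Spark's bundled cats is winning over the one in your fat jar.
    println(locationOf("cats.kernel.Semigroup"))
}
```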
Assuming this is your issue, a few ways to fix it:

1. Set `spark.driver.userClassPathFirst=true` in your Spark config.
2. Depend on a version of decline that happens to use a version of cats that's close enough to Spark's to dodge the issue.
3. Shade cats in your fat jar, so your copy can't clash with Spark's.

1 is the easy option, though in my experience most complex apps end up wanting 3 sooner or later!
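For the `userClassPathFirst` route, the setting can also be passed directly on the command line; a sketch, where the class and jar names are placeholders for your own:

```shell
# Sketch: prefer the user's classes over Spark's bundled ones.
# Note: userClassPathFirst is documented as experimental in Spark.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class example.Main \
  target/scala-2.12/myapp-assembly-0.1.0.jar
```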
Thanks for the hints @bkirwi, I went for option 3 and it works just fine.
```scala
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("cats.**" -> "repackaged.cats.@1").inAll
)
```
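For context, `assemblyShadeRules` comes from the sbt-assembly plugin, so the shade rules above only take effect if the plugin is enabled and the jar is rebuilt. A minimal sketch of the surrounding build wiring; the plugin version and the merge strategy are assumptions, not from this thread:

```scala
// project/plugins.sbt — sbt-assembly provides the assembly task and ShadeRule.
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

// build.sbt — fat jars often also need a merge strategy for
// duplicate files contributed by multiple dependencies.
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", _*) => MergeStrategy.discard
  case _                        => MergeStrategy.first
}
```

After adding this, re-run `sbt assembly` so the renamed `repackaged.cats` classes actually end up in the jar you pass to `spark-submit`.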
Hi there, I'm having trouble using decline within a fat jar in Spark. I posted a question on StackOverflow and want to mention it here as well; I hope it is relevant to others. It may be that the issue is not decline-specific, sorry in advance.

Here is my GitHub repo to reproduce the error.
I want to use decline to parse command line parameters for a Spark application. I use `sbt assembly` to create a fat jar and use it in `spark-submit`. Unfortunately I get the error

```
java.lang.NoSuchMethodError: cats.kernel.Semigroup$.catsKernelMonoidForList()Lcats/kernel/Monoid;
```

when the parameters get parsed (example below).

This is my code:
and this is my `build.sbt`:

This is how I reproduce the error locally (Spark version is `3.1.2`).

Interestingly, adding the assembled jar to the Scala classpath does not yield the same error but gives the expected help message. My local Scala version is `2.12.16` and the Spark Scala version is `2.12.10`, but I'm unsure whether this can be the cause.

I also tried Scala `2.13` with Spark `3.2.2` and got the same error, although I need to double-check that. What could I be missing?