Closed: ulrikeme closed this issue 5 years ago
Spark 2.4.x works with Scala 2.12, but you'd have to build Spark for Scala 2.12 yourself. Yeah, I think this is fine, as it's probably still more common to encounter a Spark 2.4 deployment built for Scala 2.11. In that case, just remove the two lines you changed.
Thanks for your comment. Your solution is of course the correct one. Don't you think it makes sense to push it to master? Other users are likely to run into this, and the error message
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
is very frustrating indeed. It took me a while to figure out what was going wrong.
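For anyone else who lands here: as far as I can tell, this NoSuchMethodError is the usual symptom of mixing Scala binary versions, since Scala 2.11 and 2.12 are not binary compatible and the bytecode signature of Predef.refArrayOps differs between the two. A minimal sketch of the kind of line that can trigger it (illustrative only, not the exact call site in the recommender code):

// Any collection-style method on an Array[AnyRef] goes through the implicit
// conversion Predef.refArrayOps. If this is compiled against Scala 2.12 but
// run on a Spark distribution built for Scala 2.11 (or vice versa), the call
// site refers to a method signature that does not exist in the runtime's
// scala-library, which surfaces as the error quoted above.
val langs = Array("scala", "java", "python")
val upper = langs.map(_.toUpperCase)  // relies on refArrayOps under the hood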
Well, the code here is really intended for reference and for copy-and-pasting into a shell, and the Scala version won't matter there. It's not very useful to compile it and use it as an artifact. But I agree with you; I'm just asking for a simpler change to do the same thing.
This is what I mean: https://github.com/sryza/aas/commit/a5fad93f169e71811db7cb3457a0cf076412b12b
Excellent, thanks!
Running the ch03-recommender example from a Maven-built project using the spark-2.4 profile (mvn -Pspark-2.4 package) results in the following error when launched with spark-submit:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
The Spark downloads page says: "Note that, Spark is pre-built with Scala 2.11 except version 2.4.2, which is pre-built with Scala 2.12."
Reverting to Scala 2.11 fixes the issue. I'm not sure this is the correct fix, as I'm not very experienced with Spark yet, but it works for me.
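In case it helps anyone: one way to confirm which Scala version a given Spark deployment was built with is to check from its spark-shell (spark-submit --version should print it too). A quick sketch:

// Run inside the spark-shell of the target deployment. Prints the Scala
// version the shell is running on, e.g. "version 2.11.12", which is the
// Scala version that Spark distribution was built for.
scala.util.Properties.versionString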