bjornjorgensen opened this issue 2 years ago (status: Open)
I've recently been having issues using Spark (3.x IIRC) with Java 17 (some Hadoop classes failing in a weird way, even when not using Hadoop as a cluster), in other projects of mine.
I might try to use it here, but I can't be sure it's gonna work fine.
Hi, we are trying to target Spark 3.3.x or 3.4.x+ with Scala 2.13, on Java 8 and up. Scala 2.13 is the main target so we can begin cross-compiling from Scala 3. So the big question is.... is this on your roadmap? Happy to help if pointed in the right direction.
@tculshaw It's very likely things already work this way. I've been experimenting fine with Spark 3.3 (with Scala 2.12 though) support for a few days on some large notebook infrastructure. Manual tests from Scala 2.13 also work fine.
I didn't try with Java 17 though, only Java 8.
(I just updated the README for Spark 3.3 and Scala 2.13 support.)
This works even with Spark 3.5 and Scala 2.13.12 through the almond kernel
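For reference, a typical almond notebook cell for starting Spark looks roughly like this (a sketch based on the almond docs; the exact versions below are illustrative, and `NotebookSparkSession` is almond's notebook-aware session builder, worth verifying against your almond release):

```scala
// Pull in Spark and almond's Spark integration via Ammonite's $ivy imports
// (versions here are assumptions, not pinned recommendations)
import $ivy.`org.apache.spark::spark-sql:3.5.0`
import $ivy.`sh.almond::almond-spark:0.14.0`

import org.apache.spark.sql._

// NotebookSparkSession wires Spark's UI links and job progress
// into the notebook output instead of plain stdout
val spark = NotebookSparkSession.builder()
  .master("local[*]")
  .getOrCreate()

// Quick smoke test: a trivial distributed computation
val n = spark.range(100).count()
```

This only runs inside an almond kernel session (the `$ivy` magic imports are Ammonite-specific), so it is a notebook fragment rather than a standalone program.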
I created a Dockerfile that uses Spark 3.5.1, Scala 2.13, and Java 17, derived from the jupyter docker-stacks image.
Should see a PR to the docker-stacks shortly
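The rough shape of such a Dockerfile (a sketch, not the actual PR contents: the base image tag, almond version, and coursier launcher URL are all assumptions to check against the docker-stacks conventions):

```dockerfile
# Sketch only: based on the jupyter docker-stacks base image (tag assumed)
FROM quay.io/jupyter/base-notebook:latest

USER root
# Spark 3.5.x supports Java 17; install a headless JRE
RUN apt-get update && \
    apt-get install -y --no-install-recommends openjdk-17-jre-headless curl && \
    rm -rf /var/lib/apt/lists/*

USER ${NB_UID}
# Install the almond kernel for Scala 2.13 via the coursier launcher
# (almond/Scala versions below are illustrative)
RUN curl -fLo coursier https://github.com/coursier/launchers/raw/master/coursier && \
    chmod +x coursier && \
    ./coursier launch --fork almond --scala 2.13.12 -- --install && \
    rm -f coursier
```

Spark itself can then be pulled in per-notebook through `$ivy` imports, which keeps the image small compared with baking a full Spark distribution into it.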
Hi, we are in the process of upgrading to Spark 3.3.0 with Scala 2.13 and Java 17 at jupyter docker-stacks.
We have some problems with the old, unmaintained spylon-kernel, so we will either remove Scala from the images or find a replacement.
Your project added support for Spark 3.2.0 ten days ago.
The question now is: how much work would it be to add support for Spark 3.3.0 with Scala 2.13 and Java 17?