Closed: cjekal closed this issue 5 years ago.
I'm having the same issue.
gettyimages/spark:2.4.0-hadoop-3.0
works but the update to java 12 in 2.4.1 broke it.
Hello, same here with pyspark, typing:
df = sc.parallelize([1, 2, 3, 4])
df.count()
Not a Parquet issue but a general issue, I guess.
I'll try OpenJDK, since you now need an account to get the Java 8 JDK from Oracle.
I tested both use cases on the latest image and did not get an error.
I just pulled and stood it up using docker-compose, and I get
pyspark.sql.utils.IllegalArgumentException: 'Unsupported class file major version 55'
(in Python, obviously)
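For anyone puzzled by the number in the error: the class file major version maps directly to a Java release (major minus 44 gives the Java feature version, so 52 is Java 8, 55 is Java 11, and so on). A minimal sketch, with a helper name of my own choosing:

```python
# The JVM class file "major version" maps directly to a Java release:
# major - 44 == Java feature version (52 -> Java 8, 55 -> 11, 56 -> 12, 57 -> 13).
def java_release_for_class_major(major: int) -> int:
    """Translate a class file major version into the Java release that produced it."""
    return major - 44

# "Unsupported class file major version 55" therefore means classes compiled
# for Java 11 were handed to a JVM (or bytecode reader) that only knows older formats.
print(java_release_for_class_major(55))  # -> 11
```

So each "major version NN" report in this thread is just telling you which too-new Java release produced the classes.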
Hello, same here with pyspark, typing:
df = sc.parallelize([1, 2, 3, 4])
df.count()
Not a Parquet issue but a general issue, I guess.
What is the solution?
This is probably no longer relevant, but I had this issue when I was using Java 11 with Scala 2.11.12. The solution is to downgrade to Java 8.
I still have the same issue, using the code
df = sc.parallelize([1, 2, 3, 4])
df.count()
This time the error is "unsupported class file major version 56". My Spark version is 2.4.4, with the default Python 2.7.16. My Java version is 12.0.2 (it came with my macOS system). My guess is that my Java version is too new for Spark in this case? But how can I downgrade to Java 8? Will this mess up my OS? I'm using a local Spark installation directly on my Mac, so I'd rather have Java 8 just for Spark and keep the newer Java for the rest of my system.
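It is possible to keep the newer system Java and point only the PySpark session at a Java 8 install, since PySpark's launcher reads the JAVA_HOME environment variable when it starts the JVM. A minimal sketch; the JDK path below is an assumption, so substitute whatever `/usr/libexec/java_home -v 1.8` prints on your Mac once a Java 8 JDK is installed:

```python
import os

# Give only this PySpark session a Java 8 home, leaving the system default
# Java untouched. The path is an assumption -- replace it with the output
# of `/usr/libexec/java_home -v 1.8` for your installed Java 8 JDK.
JAVA8_HOME = "/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home"
os.environ["JAVA_HOME"] = JAVA8_HOME

# PySpark reads JAVA_HOME when the JVM is launched, so creating the
# SparkContext after this point should use the Java 8 runtime:
#   from pyspark import SparkContext
#   sc = SparkContext("local[*]", "java8-session")
#   sc.parallelize([1, 2, 3, 4]).count()
```

Because the variable is set only inside this process, the rest of the system keeps using the newer Java.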
did you find a solution @lizzyhuang ?
I had the same problem:
java.lang.IllegalArgumentException: Unsupported class file major version 57
using Windows, Java 13.0.2, Scala 2.11.12, Spark 2.4.5.
I fixed it by downgrading to JDK 1.8, which I got from adoptopenjdk.net. Then I adjusted the PATH/JAVA_HOME environment variables and the error was gone.
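A quick way to confirm the downgrade took effect is to parse what `java -version` now reports. A small sketch (the helper name is mine; note that the version banner historically goes to stderr, not stdout):

```python
import re

def parse_java_major(version_output: str) -> int:
    """Extract the Java feature version from `java -version` banner text."""
    m = re.search(r'version "(\d+)(?:\.(\d+))?', version_output)
    if m is None:
        raise ValueError("unrecognised java version string")
    major = int(m.group(1))
    # Pre-Java-9 releases report themselves as 1.x: "1.8.0_252" means Java 8.
    return int(m.group(2)) if major == 1 else major

# To check the java actually on PATH, feed it the stderr of `java -version`:
#   out = subprocess.run(["java", "-version"], capture_output=True, text=True).stderr
#   parse_java_major(out)  # want 8 for Spark 2.4.x
print(parse_java_major('java version "1.8.0_252"'))  # -> 8
print(parse_java_major('java version "13.0.2"'))     # -> 13
```

If this still reports 13 after editing PATH/JAVA_HOME, a new terminal (or re-login on Windows) is usually needed for the change to be picked up.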
did you find a solution @lizzyhuang ?
@Jaz-B no, I have a Mac so I don't really know which version of Java to download tbh....I just gave up and did some other stuff.
I had the same problem:
java.lang.IllegalArgumentException: Unsupported class file major version 57
using Windows, Java 13.0.2, Scala 2.11.12, Spark 2.4.5.
I fixed it by downgrading to JDK 1.8, which I got from adoptopenjdk.net. Then I adjusted the PATH/JAVA_HOME environment variables and the error was gone.
Thank you. I'll try this on my Mac. Not sure whether I will need to change the path, though...
I just recently pulled the latest using
gettyimages/spark:2.4.1-hadoop-3.0
and when I ran the following code, I received an obscure
java.lang.IllegalArgumentException: Unsupported class file major version 56
error. Here's the full spark-shell session below:
Any ideas on what could have caused this? Is it b/c of the base image
debian:stretch
without any tag qualifier?