Gelerion / spark-sketches

Integrating probabilistic algorithms into Spark using DataSketches
MIT License

java.lang.NoClassDefFoundError: scala/Serializable #6

Open: saartamir opened this issue 1 year ago

saartamir commented 1 year ago

Hello, I'm using Spark 3.4.1, Scala 2.13.11, and Java 11, with the dependency "com.gelerion.spark.sketches" % "spark-sketches" % "1.0.0".
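
Roughly, the relevant part of my build.sbt looks like this (a simplified sketch with only the lines that matter here; the spark-sql line and its provided scope are just the usual Spark setup):

```scala
// build.sbt (simplified sketch, only the relevant lines)
scalaVersion := "2.13.11"

libraryDependencies ++= Seq(
  // Spark itself is cross-built per Scala version, hence %%.
  "org.apache.spark" %% "spark-sql" % "3.4.1" % "provided",
  // Plain %: spark-sketches has no Scala-version suffix in its artifact name,
  // so the same jar is resolved regardless of scalaVersion.
  "com.gelerion.spark.sketches" % "spark-sketches" % "1.0.0"
)
```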

I'm getting this error:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Serializable
    at java.base/java.lang.ClassLoader.defineClass1(Native Method)
    at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1017)
    at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:174)
    at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:800)
    at java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:698)
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:621)
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:579)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
    at org.apache.spark.sql.registrar.SketchFunctionsRegistrar$.expressions(SketchFunctionsRegistrar.scala:12)
    at org.apache.spark.sql.registrar.FunctionsRegistrar.registerFunctions(FunctionsRegistrar.scala:25)
    at org.apache.spark.sql.registrar.FunctionsRegistrar.registerFunctions$(FunctionsRegistrar.scala:24)
    at org.apache.spark.sql.registrar.SketchFunctionsRegistrar$.registerFunctions(SketchFunctionsRegistrar.scala:9)
    at org.apache.spark.sql.registrar.FunctionsRegistrar.registerFunctions(FunctionsRegistrar.scala:29)
    at org.apache.spark.sql.registrar.FunctionsRegistrar.registerFunctions$(FunctionsRegistrar.scala:28)
    at org.apache.spark.sql.registrar.SketchFunctionsRegistrar$.registerFunctions(SketchFunctionsRegistrar.scala:9)
    at io.anzu.spark.poc.Consumer$.main(Consumer.scala:20)
    at io.anzu.spark.poc.Consumer.main(Consumer.scala)

This happens when I use the Spark SQL expression: theta_sketch_build(IF(type='start', udid, NULL)) app_uniq_users
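
A simplified sketch of how I wire it up (the app object and table name are illustrative, and the exact shape of the registerFunctions call is inferred from the stack trace above):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.registrar.SketchFunctionsRegistrar

object Consumer {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sketches-poc")
      .master("local[*]")
      .getOrCreate()

    // The NoClassDefFoundError above is thrown here, while the registrar
    // loads the sketch expression classes.
    SketchFunctionsRegistrar.registerFunctions(spark)

    spark.sql(
      """SELECT theta_sketch_build(IF(type = 'start', udid, NULL)) AS app_uniq_users
        |FROM events""".stripMargin
    ).show()
  }
}
```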

Can you tell me what I'm doing wrong?

Gelerion commented 1 year ago

Hi @saartamir, Scala 2.13 is currently not supported -- I am planning to add support for it in the next few weeks.
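
For background, the error is consistent with running a Scala 2.12 artifact on a 2.13 runtime: in 2.12, scala.Serializable is a real trait with its own class file, while in 2.13 it is only a deprecated type alias for java.io.Serializable, so code compiled against 2.12 that references it fails to load. A quick illustrative check (not part of this library):

```scala
// Illustrative check of which Serializable the runtime provides.
object SerializableCheck extends App {
  try {
    // Loads on a Scala 2.12 classpath; on 2.13 the trait no longer exists
    // as a class file (it is only a type alias), so this throws.
    Class.forName("scala.Serializable")
    println("scala.Serializable found: Scala 2.12 library on the classpath")
  } catch {
    case _: ClassNotFoundException =>
      println("scala.Serializable missing: Scala 2.13 library on the classpath")
  }
}
```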