streamnative / pulsar-spark

Spark Connector to read and write with Pulsar
Apache License 2.0

[BUG] Exception in thread "main" java.lang.NoClassDefFoundError: scala/$less$colon$less #177

Open vaibhavsw opened 1 month ago

vaibhavsw commented 1 month ago

I am using StreamNative's Apache Pulsar connector version 3.4.4.1 with Spark dependencies version 3.4.1. When submitting the Spark app from the console, I get the error below.

Exception in thread "main" java.lang.NoClassDefFoundError: scala/$less$colon$less
        at org.apache.spark.sql.pulsar.PulsarProvider.sourceSchema(PulsarProvider.scala:58)
        at org.apache.spark.sql.execution.datasources.DataSource.sourceSchema(DataSource.scala:233)
        at org.apache.spark.sql.execution.datasources.DataSource.sourceInfo$lzycompute(DataSource.scala:118)
        at org.apache.spark.sql.execution.datasources.DataSource.sourceInfo(DataSource.scala:118)
        at org.apache.spark.sql.execution.streaming.StreamingRelation$.apply(StreamingRelation.scala:35)
        at org.apache.spark.sql.streaming.DataStreamReader.loadInternal(DataStreamReader.scala:168)
        at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:144)
        at com.pinelabs.spark.pulsar.CardTransactionStreamProcessor.main(CardTransactionStreamProcessor.java:44)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:568)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.$less$colon$less
        at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:445)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:592)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
        ... 20 more

I get the same error with Spark 3.5.2, which, according to the documentation, should work.
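For reference, the failing call at CardTransactionStreamProcessor.java:44 is just the standard `readStream().format("pulsar")` usage. The sketch below is illustrative only; the service URL and topic are placeholders, not my actual values:

```java
package com.pinelabs.spark.pulsar;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class CardTransactionStreamProcessor {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("CardTransactionStreamProcessor")
                .getOrCreate();

        // The NoClassDefFoundError above is thrown from this load() call,
        // when PulsarProvider.sourceSchema is invoked.
        Dataset<Row> stream = spark.readStream()
                .format("pulsar")
                .option("service.url", "pulsar://localhost:6650")                  // placeholder
                .option("topic", "persistent://public/default/card-transactions")  // placeholder
                .load();

        stream.writeStream()
                .format("console")
                .start()
                .awaitTermination();
    }
}
```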

vaibhavsw commented 1 month ago

Any help on this?