Closed · m-evazzadeh closed this issue 4 years ago
java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.reader.SupportsScanUnsafeRow
Looks like you may be missing jars or have incompatible ones. Looking at your log, you are using Spark 2.4.1. However, the Kafka jar you are using is org.apache.spark_spark-sql-kafka-0-10_2.11-2.3.2.jar, which I think is meant to be used with Spark 2.3.2. What was the spark-submit command you used? If you aren't using it already, I would advise using the --packages option with spark-submit. Please try the suggestion in #341 as well.
@m-evazzadeh did this solve your issue?
Hi,
It worked after changing --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.2 to --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.1 in the command.
Thanks @suhsteve
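For reference, a full spark-submit invocation with the matching connector version might look like the sketch below. The microsoft-spark jar name, master URL, and application dll are placeholders; substitute the ones from your own setup (the DotnetRunner class comes from the .NET for Apache Spark worker jar):

```shell
# Pull the Kafka connector that matches the Spark runtime (2.4.1 here);
# --packages resolves the artifact and its transitive dependencies via Ivy,
# which is what produces the "resolving dependencies" log shown below.
spark-submit \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.1 \
  --class org.apache.spark.deploy.dotnet.DotnetRunner \
  --master local \
  microsoft-spark-2.4.x-<version>.jar \
  dotnet MySparkStreamingApp.dll
```

The key point is that the version suffix on spark-sql-kafka-0-10 (2.4.1) matches the installed Spark runtime, and the Scala version suffix (2.11) matches the Scala build of that Spark distribution.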
I am using Apache Spark in .NET Core.
I'm trying to connect Spark Streaming with Kafka; when I run my application I get the errors below.
--- my source code:
--- Log:
Ivy Default Cache set to: C:\Users\MyUserAccount\.ivy2\cache
The jars for the packages stored in: C:\Users\MyUserAccount\.ivy2\jars
:: loading settings :: url = jar:file:/C:/bin/spark-2.4.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-sql-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-. . .