[Closed] SummerBulb closed this issue 6 years ago
Thanks for trying this! I think you are probably still using sparklens2.11-0.1.0.jar instead of sparklens2.10-0.1.0.jar, which is generated from the spark_1.6 branch. The SparkListener interface changed in Spark 2.0.0, hence the need for two versions. Also, Spark 1.6 uses Scala 2.10.5 whereas Spark 2.0.0+ uses Scala 2.11.8. Please try again with the sparklens2.10-0.1.0.jar built from the spark_1.6 branch.
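For reference, a minimal sketch of how the listener gets wired up on Spark 1.6, assuming the sparklens2.10-0.1.0.jar from the spark_1.6 branch is on the driver classpath (the listener class name is the one documented in the Sparklens README; the surrounding app code is purely illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Registering Sparklens via spark.extraListeners is equivalent to passing
//   --jars sparklens2.10-0.1.0.jar
//   --conf spark.extraListeners=com.qubole.sparklens.QuboleJobListener
// on the spark-submit command line.
val conf = new SparkConf()
  .setAppName("sparklens-demo")
  .set("spark.extraListeners", "com.qubole.sparklens.QuboleJobListener")

val sc = new SparkContext(conf)
// ... run the job as usual; Sparklens prints its report when the app ends.
sc.stop()
```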
@iamrohit thanks, I was, indeed, on the wrong branch.
Unfortunately, I am now getting a different error:
18/03/23 08:50:11 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.AbstractMethodError
at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:64)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:55)
at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(AsynchronousListenerBus.scala:80)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:65)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:65)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:64)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1181)
at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
Hey @SummerBulb, which specific version of Spark are you using: 1.6.0, 1.6.1, 1.6.2, or 1.6.3? If you are using a custom distribution that has modified the SparkListener interface, please change the spark-core dependency in build.sbt to compile against your specific distribution.
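(Background on the error: java.lang.AbstractMethodError means the listener bytecode was compiled against a version of an interface whose abstract methods don't match what the runtime is actually invoking. A contrived sketch of the failure mode, using generic names that are not Sparklens internals:)

```scala
// Compile a listener against interface version A:
trait JobListener {
  def onJobEnd(jobId: Int): Unit
}

class MyListener extends JobListener {
  def onJobEnd(jobId: Int): Unit = println(s"job $jobId done")
}

// If the cluster instead ships interface version B, where the signature
// changed, e.g.
//   def onJobEnd(jobId: Int, time: Long): Unit
// then MyListener carries no implementation for that method, and calling
// it fails at runtime with java.lang.AbstractMethodError rather than at
// compile time. Rebuilding against the exact spark-core the cluster ships
// is the fix.
```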
Thanks!
I changed build.sbt to the following, and it ran without any errors.
name := "sparklens"
organization := "com.qubole"
version := "0.1.0"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0-cdh5.7.6"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
resolvers += "cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
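That fix lines up with the explanation above: with spark-core pinned to the vendor's 1.6.0-cdh5.7.6 artifact (resolved from the Cloudera repository), sbt compiles the listener against the exact SparkListener interface the cluster actually ships, so the method signatures match at runtime and the AbstractMethodError goes away.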
Following the instructions on the home page, I built the code and used it with spark-submit: I added the jar to the list of jars and added the --conf setting. Here is the error I get:
The same error occurs when running Scala Spark.