ooyala / spark-jobserver

REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.

graphx VertexRDD java.lang.ClassNotFoundException #122

Closed zakibenz closed 7 years ago

zakibenz commented 7 years ago

I am developing a SparkJob on jobserver (v0.6.2, Spark 1.6.1) using Spark GraphX, and I am running into the following exception when trying to launch my job on Spark JobServer:

    {
      "status": "JOB LOADING FAILED",
      "result": {
        "errorClass": "java.lang.NoClassDefFoundError",
        "cause": "org.apache.spark.graphx.VertexRDD",
        "stack": ["java.net.URLClassLoader.findClass(URLClassLoader.java:381)", "java.lang.ClassLoader.loadClass(ClassLoader.java:424)", "java.lang.ClassLoader.loadClass(ClassLoader.java:357)", "java.lang.Class.getDeclaredFields0(Native Method)", "java.lang.Class.privateGetDeclaredFields(Class.java:2583)", "java.lang.Class.getField0(Class.java:2975)", "java.lang.Class.getField(Class.java:1701)", "spark.jobserver.util.JarUtils$.loadObject(JarUtils.scala:61)", "spark.jobserver.util.JarUtils$.loadClassOrObject(JarUtils.scala:37)", "spark.jobserver.JobCache$$anonfun$getSparkJob$1.apply(JobCache.scala:46)", "spark.jobserver.JobCache$$anonfun$getSparkJob$1.apply(JobCache.scala:37)", "spark.jobserver.util.LRUCache.get(LRUCache.scala:35)", "spark.jobserver.JobCache.getSparkJob(JobCache.scala:37)", "spark.jobserver.JobManagerActor$$anonfun$startJobInternal$1.apply$mcV$sp(JobManagerActor.scala:216)", "scala.util.control.Breaks.breakable(Breaks.scala:37)", "spark.jobserver.JobManagerActor.startJobInternal(JobManagerActor.scala:192)", "spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:144)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)", "ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)", "ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:24)", "akka.actor.Actor$class.aroundReceive(Actor.scala:467)", "ooyala.common.akka.InstrumentedActor.aroundReceive(InstrumentedActor.scala:8)", "akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)", "akka.actor.ActorCell.invoke(ActorCell.scala:487)", "akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)", "akka.dispatch.Mailbox.run(Mailbox.scala:220)", "akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)", "scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)", "scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)", "scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)", "scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)"],
        "causingClass": "java.lang.ClassNotFoundException",
        "message": "org/apache/spark/graphx/VertexRDD"
      }
    }

This happens although I've included the GraphX dependency both in my build.sbt and in Dependencies.scala on the jobserver side.

Any help?
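For context, the GraphX dependency declaration I'm describing looks roughly like this (a sketch, not my exact build file; the versions match the Spark 1.6.1 / jobserver 0.6.2 setup above, and `provided` assumes Spark is supplied by the cluster at runtime):

```scala
// build.sbt (sketch) — spark-graphx must match the Spark version
// the jobserver itself runs against, 1.6.1 in this case.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"     % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-graphx"   % "1.6.1" % "provided",
  "spark.jobserver"  %% "job-server-api" % "0.6.2" % "provided"
)
```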

zakibenz commented 7 years ago

Sorry, I had not seen the troubleshooting section; the solution was there: https://github.com/spark-jobserver/spark-jobserver/blob/master/doc/troubleshooting.md#javalangclassnotfoundexception-when-staring-spark-jobserver-from-sbt

Before typing `reStart` in sbt, switch to the `job-server-extras` subproject with `project job-server-extras`, and only then start the server with `reStart`.
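Concretely, the fix looks like this at the sbt prompt (the `>` lines are typed inside the interactive sbt session; `job-server-extras` is the subproject that bundles the extra runtime dependencies):

```shell
sbt
> project job-server-extras
> reStart
```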