ooyala / spark-jobserver

REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.

JobServer not working with spark 1.2.0 #102

Open · richiesgr opened this issue 9 years ago

richiesgr commented 9 years ago

Hi

When I call RDD.registerTempTable inside my job, I get the error below. The job does nothing more than submit a SQL query to Spark SQL.

"message": "class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial classloader with boot classpath [/usr/lib/jvm/java-7-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-7-oracle/jre/lib/rt.jar:/usr/lib/jvm/java-7-oracle/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-7-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-7-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-7-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-7-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-7-oracle/jre/classes:/home/richard/.sbt/boot/scala-2.10.4/lib/scala-library.jar:/home/richard/.sbt/boot/scala-2.10.4/lib/scala-compiler.jar:/home/richard/.sbt/boot/scala-2.10.4/lib/jline.jar:/home/richard/.sbt/boot/scala-2.10.4/lib/scala-reflect.jar:/home/richard/.sbt/boot/scala-2.10.4/lib/jansi.jar] not found.", "errorClass": "scala.reflect.internal.MissingRequirementError", "stack": ["scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)", "scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)", "scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)", "scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)", "scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72)", "scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119)", "scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21)", "org.apache.spark.sql.catalyst.ScalaReflection$$typecreator1$1.apply(ScalaReflection.scala:115)", "scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231)", "scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231)", "scala.reflect.api.TypeTags$class.typeOf(TypeTags.scala:335)", "scala.reflect.api.Universe.typeOf(Universe.scala:59)", "org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:115)", "org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)", "org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:100)", "org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)", "org.apache.spark.sql.catalyst.ScalaReflection$class.attributesFor(ScalaReflection.scala:94)", "org.apache.spark.sql.catalyst.ScalaReflection$.attributesFor(ScalaReflection.scala:33)", "org.apache.spark.sql.SQLContext.createSchemaRDD(SQLContext.scala:111)", "com.inneractive.tesla.jobs.RawToParquet$.runJob(RawToParquet.scala:51)", "spark.jobserver.SparkJobServer$class.runJob(SparkSQLJobServer.scala:24)", "spark.jobserver.SparkParquetJobServer$.runJob(SparkSQLJobServer.scala:47)", "spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:219)", "scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)", "scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)", "akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)", "akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)", "scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)", "scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)", "scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)", "scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)"]