ooyala / spark-jobserver

REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.

Spark 1.0 NoSuchMethodError #29

Open chipsenkbeil opened 10 years ago

chipsenkbeil commented 10 years ago

When upgrading from Spark 0.9.1-incubating to Spark 1.0, a NoSuchMethodError is now thrown:

curl -X POST "ADDRESS/contexts/my_context?spark.cores.max=2&spark.executor.memory=512m"

{
  "status": "CONTEXT INIT ERROR",
  "result": {
    "message":
"org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;",
    "errorClass": "java.lang.NoSuchMethodError",
    "stack":
["spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)",
"spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:251)",
"spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:103)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)",
"ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)",
"ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)",
"ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)",
"ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)",
"scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)",
"ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:24)",
"akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)",
"akka.actor.ActorCell.invoke(ActorCell.scala:456)",
"akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)",
"akka.dispatch.Mailbox.run(Mailbox.scala:219)",
"akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)",
"scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)",
"scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)",
"scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)",
"scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)"]
  }
}

Am I missing something? Or did something in Spark's interface change that needs to be updated in the job server? Are there plans to update the job server for Spark 1.0 if that is the case?
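A side note on reading that message: the missing symbol is Scala's JVM encoding of operator characters. A quick sketch (a plain string substitution, not a full demangler) recovers the readable name:

```shell
# Decode the Scala-mangled method name from the error message.
# Scala encodes "<" as "$less" and ">" as "$greater"; this substitution
# is only enough for this particular symbol, not a general demangler.
name='$lessinit$greater$default$2'
echo "$name" | sed -e 's/\$less/</g' -e 's/\$greater/>/g'
# prints: <init>$default$2
```

That is, the missing method is the compiler-generated default value for the second parameter of the SparkContext constructor (`<init>`). Since Scala compiles default arguments to synthetic `<init>$default$N` methods, changing a constructor default between Spark versions breaks binary compatibility, which would explain why a recompile is needed.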

velvia commented 10 years ago

Hi Chip,

I think we need to recompile the job server for Spark 1.0; some API must have changed.

-Evan


chipsenkbeil commented 10 years ago

I attempted to rebuild with Spark 1.0.0 instead of Spark 0.9.1; the project compiles, but 31 tests fail.

I thought I might have missed something important (such as toArray being deprecated) and grabbed the changes from here: https://github.com/ooyala/spark-jobserver/pull/30

I'm still getting the same failures from sbt test.

(nine screenshots of the failing tests were attached)

mnarrell commented 10 years ago

Is there any progress on this issue?

nightwolfzor commented 10 years ago

+1, getting this error as well. I can't figure out why the tests pass despite this; I'm guessing it's because they use a local context.

happygeorge01 commented 10 years ago

+1, I get this error as well and can't get the two to work together.

ryleg commented 10 years ago

I don't know if it fixes this issue, but I was having a similar error on Spark 1.0.0 that was fixed by changing project/Dependencies.scala to this:

lazy val sparkDeps = Seq(
  "org.apache.spark" %% "spark-core" % "1.0.0" % "provided" exclude("io.netty", "netty-all"),
  // Force netty version. This avoids some Spark netty dependency problem.
  "io.netty" % "netty" % "3.6.6.Final"
)

Here is the error I was getting before changing it:

job-server[ERROR] Uncaught error from thread [JobServer-akka.actor.default-dispatcher-3] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[JobServer]
job-server[ERROR] java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.rddToPairRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;
job-server[ERROR]     at com.radius.matchstick.MatchApp.(MatchApp.scala:20)
job-server[ERROR]     at com.radius.matchstick.MatchJob$.runJob(MatchJob.scala:43)
job-server[ERROR]     at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:218)
job-server[ERROR]     at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
job-server[ERROR]     at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
job-server[ERROR]     at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
job-server[ERROR]     at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
job-server[ERROR]     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
job-server[ERROR]     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
job-server[ERROR]     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
job-server[ERROR]     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
job-server ... finished with exit code 255

armtuk commented 9 years ago

So, I'm now getting a similar problem building against Spark 1.1.0: when I update the dependency in project/Dependencies.scala to spark-1.1.0 (curiously, even though I have build 0.4, it is set to Spark 0.9.1), spark.jobserver.JobManagerSpec fails.

velvia commented 9 years ago

Are you using spark-jobserver/spark-jobserver? Its master is now at Spark 1.1.0, so you should not need to update Dependencies.scala yourself. Furthermore, the build and test status is now verified by Travis CI.


mvenkatm commented 9 years ago

I am facing a similar issue on my workstation. I have Spark 1.0.2 and job server 0.4.

java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
    at spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)
    at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:251)
    at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:103)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)
    at ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)
    at ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:20)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
    at akka.actor.ActorCell.invoke(ActorCell.scala:456)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

velvia commented 9 years ago

How are you running it? What scripts are you using? Is there any way to check your jar path / classpath? This sounds like a class conflict.
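One quick way to spot such conflicts is to look for the same artifact appearing twice under different versions. A minimal sketch, run here on an illustrative classpath (the paths and versions below are made up; substitute the server's actual classpath):

```shell
# List artifacts that occur more than once on a classpath, ignoring
# their version suffixes. The CLASSPATH value is illustrative only.
CLASSPATH="/opt/lib/spark-core_2.10-0.9.1-incubating.jar:/opt/lib/spark-core_2.10-1.0.0.jar:/opt/lib/akka-actor_2.10-2.2.4.jar"
echo "$CLASSPATH" | tr ':' '\n' \
  | sed -e 's#.*/##' -e 's/-[0-9].*//' \
  | sort | uniq -d
# prints: spark-core_2.10
```

Any artifact printed by `uniq -d` is present in two or more versions, which is exactly the situation that produces a NoSuchMethodError at runtime even though everything compiled.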
