hammerlab / guacamole

Spark-based variant calling, with experimental support for multi-sample somatic calling (including RNA) and local assembly
Apache License 2.0

Breeze NoSuchMethodError #585

Closed: ryan-williams closed this issue 8 years ago

ryan-williams commented 8 years ago

Some parts of Breeze need to be shaded in order for Guacamole to run via spark-submit: mllib depends on Breeze 0.11.2, and I think that version ends up on the classpath ahead of our 0.12 dependency when we use spark-{submit,shell}.
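
One way to confirm which Breeze versions are in play (a hypothetical check, not from the original report; org.scalanlp is Breeze's Maven group ID, as in the artifactSet further down):

    # Hypothetical check: list every org.scalanlp (Breeze) artifact in the
    # dependency tree; spark-mllib should show a transitive 0.11.2 alongside
    # our direct 0.12 dependency.
    mvn dependency:tree -Dincludes=org.scalanlp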

Example crash, from just after #571 was merged:

git checkout b97411d
mvn package -Pguac,deps
spark-submit --driver-memory 10g --jars target/guacamole-deps-only-0.0.1-SNAPSHOT.jar target/guacamole-0.0.1-SNAPSHOT.jar somatic-standard --normal-reads src/test/resources/synth1.normal.100k-200k.withmd.bam --tumor-reads src/test/resources/synth1.tumor.100k-200k.withmd.bam --reference-fasta $ref --out /tmp/foo.vcf
…
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 1 times, most recent failure: Lost task 2.0 in stage 6.0 (TID 42, localhost): java.lang.NoSuchMethodError: breeze.linalg.sum$.sumSummableThings(Lscala/Predef$$less$colon$less;Lbreeze/generic/UFunc$UImpl2;)Lbreeze/generic/UFunc$UImpl;
    at org.hammerlab.guacamole.likelihood.Likelihood$$anonfun$4.apply(Likelihood.scala:161)
    at org.hammerlab.guacamole.likelihood.Likelihood$$anonfun$4.apply(Likelihood.scala:156)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
    at org.hammerlab.guacamole.likelihood.Likelihood$.likelihoodsOfGenotypes(Likelihood.scala:156)
    at org.hammerlab.guacamole.likelihood.Likelihood$.likelihoodsOfAllPossibleGenotypesFromPileup(Likelihood.scala:78)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$.findPotentialVariantAtLocus(SomaticStandardCaller.scala:182)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$$anonfun$1.apply(SomaticStandardCaller.scala:95)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$$anonfun$1.apply(SomaticStandardCaller.scala:94)
    at org.hammerlab.guacamole.distributed.PileupFlatMapUtils$$anonfun$pileupFlatMapTwoSamples$1.apply(PileupFlatMapUtils.scala:84)
    at org.hammerlab.guacamole.distributed.PileupFlatMapUtils$$anonfun$pileupFlatMapTwoSamples$1.apply(PileupFlatMapUtils.scala:79)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$windowFlatMapWithState$1$$anonfun$apply$1.apply(WindowFlatMapUtils.scala:65)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$windowFlatMapWithState$1$$anonfun$apply$1.apply(WindowFlatMapUtils.scala:55)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$splitPartitionByContigAndMap$2.apply(WindowFlatMapUtils.scala:141)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$splitPartitionByContigAndMap$2.apply(WindowFlatMapUtils.scala:131)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:284)
    at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1157)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$.run(SomaticStandardCaller.scala:107)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$.run(SomaticStandardCaller.scala:53)
    at org.hammerlab.guacamole.commands.SparkCommand.run(SparkCommand.scala:12)
    at org.hammerlab.guacamole.commands.Command.run(Command.scala:27)
    at org.hammerlab.guacamole.Main$.main(Main.scala:49)
    at org.hammerlab.guacamole.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoSuchMethodError: breeze.linalg.sum$.sumSummableThings(Lscala/Predef$$less$colon$less;Lbreeze/generic/UFunc$UImpl2;)Lbreeze/generic/UFunc$UImpl;
    … (stack trace identical to the executor-side trace above)

I poked a bit at how to do the shading just now but couldn't get it working.

arahuja commented 8 years ago

Ah, sorry, I have this fixed locally. Spark < 2.0 brings in an older version of Breeze; I fixed it by adding an additional relocation, as follows:

    <relocation>
      <pattern>breeze.linalg</pattern>
      <shadedPattern>org.hammerlab.breeze</shadedPattern>
      <includes>
        <include>breeze.linalg.**</include>
      </includes>
    </relocation>

I don't think that is the right solution, but excluding Breeze is not sufficient: when you run with the spark-assembly on the classpath, Breeze is included there. I tried the various userClassPathFirst parameters available, but none of them solved the issue.
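
For reference, that approach amounts to something like the following invocation (illustrative only; the exact parameters tried aren't recorded here, though both configs do exist in Spark 1.x):

    # Illustrative: ask Spark to prefer user jars over spark-assembly classes.
    # These are real Spark 1.x configs, but per the above they did not resolve
    # the Breeze conflict in this case.
    spark-submit \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      --jars target/guacamole-deps-only-0.0.1-SNAPSHOT.jar \
      target/guacamole-0.0.1-SNAPSHOT.jar somatic-standard …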

I think we can just move to Spark 2.0 now though?

ryan-williams commented 8 years ago

Weird, I tried exactly that locally and wasn't seeing it work, but I'm sure I was just checking it wrong; it was late last night.

AFAICT Spark 2.0.0 still uses Breeze 0.11.2; they upgraded Breeze on July 19, right around when the 2.0 branch seems to have been cut, but I guess it didn't make it in?

Either way, I think we should just fix it now and not couple it to a Spark 2.0 upgrade.

ryan-williams commented 8 years ago

I don't think that is the right solution

OOC, why not? This seems like exactly the case where targeted shading is the right way to get what we want, similar to the Guava situation.

arahuja commented 8 years ago

OOC, why not? This seems like exactly the case where targeted shading is the right way to get what we want, similar to the Guava situation.

Yeah, that's true. I don't really have a reason why; it just seems kludgey.

Either way, I think we should just fix it now and not couple it to a Spark 2.0 upgrade.

Sure, I'll verify this still works on master and make a PR

arahuja commented 8 years ago

Also, double-checking: it seems the above was what I tried originally and it didn't work, but the following did:

    <relocation>
      <pattern>breeze</pattern>
      <shadedPattern>org.hammerlab.breeze</shadedPattern>
      <includes>
        <include>breeze.**</include>
      </includes>
    </relocation>

ryan-williams commented 8 years ago

Yeah, I just noticed that we need some breeze.math things shaded in addition to breeze.linalg.
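
(A hypothetical way to enumerate what needs shading: disassemble the class from the stack trace above and list the Breeze packages it references. javap ships with the JDK; the class name is the compiled Likelihood object.)

    # Expect breeze/linalg plus breeze/math (implicits backing sum, etc.).
    javap -cp target/classes -c 'org.hammerlab.guacamole.likelihood.Likelihood$' \
      | grep -o 'breeze/[a-z]*' | sort -u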

Also, my previous approach wasn't working because I had a typo in the breeze line of:

    <artifactSet>
      <includes>
        <include>org.hammerlab.guacamole:*</include>
        <include>com.google.guava:*</include>
        <include>org.scalanlp:breeze_${scala.version.prefix}</include>
      </includes>
    </artifactSet>

ryan-williams commented 8 years ago

I'm trying a version with a second Breeze relocation just for breeze.math, though it doesn't seem to have changed what ends up in the guac JAR…

ryan-williams commented 8 years ago

Yeah, that's true. I don't really have a reason why; it just seems kludgey.

Agreed, it's annoying to have to deal with, but there isn't really another option for situations like this, afaik!

Sure, I'll verify this still works on master and make a PR

Sounds great, thanks.

Locally, I'm up to my third breeze-subpackage relocation (numerics…); seems like shading all of Breeze may be the way to go.
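
To see how many subpackages the one-at-a-time approach would eventually cover, one could list Breeze's top-level packages straight out of its jar (hypothetical command; the local-repo path assumes a Scala 2.10 / Breeze 0.12 build):

    # Enumerate Breeze's top-level subpackages (linalg, math, numerics,
    # optimize, stats, ...); relocating them one at a time adds up fast.
    jar tf ~/.m2/repository/org/scalanlp/breeze_2.10/0.12/breeze_2.10-0.12.jar \
      | grep '^breeze/' | cut -d/ -f1-2 | sort -u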

ryan-williams commented 8 years ago

Also re: my

it doesn't seem to have changed what ends up in the guac JAR…

it turns out I was checking by just counting hits for "breeze" in a jar tf listing, but I should have been grepping for "hammerlab/breeze", because all of Breeze is in there either way. TIL.
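
Concretely, the corrected check looks something like this (hypothetical commands matching the description above; only the first count actually shows whether the relocation took effect):

    # Relocated classes land under org/hammerlab/breeze/ in the shaded jar.
    jar tf target/guacamole-0.0.1-SNAPSHOT.jar | grep -c 'hammerlab/breeze'
    # This count is nonzero either way, so it proves nothing by itself.
    jar tf target/guacamole-0.0.1-SNAPSHOT.jar | grep -c 'breeze'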