aehrc / VariantSpark

machine learning for genomic variants
http://bioinformatics.csiro.au/variantspark

Install on our supercomputer throws `java.lang.IncompatibleClassChangeError: Inconsistent constant pool data in classfile for class au/csiro/pbdava/ssparkle/spark/SparkApp. Method 'java.lang.String $anonfun$spark$1(au.csiro.pbdava.ssparkle.spark.SparkApp)' at index 118 is CONSTANT_MethodRef and should be CONSTANT_InterfaceMethodRef` #231

Closed — surak closed this 11 months ago

surak commented 1 year ago

I installed it using Maven 3.9.2, Arrow/8.0.0, Spark/3.3.1-CUDA-11.7, Java/11.0.16, and Python 3.10. I removed the version pins from your dev-requirements.txt because they are broken.

```
VariantSpark/bin/variant-spark --spark --master local[*] -- importance -if /p/project/atmlaml/bazarova1/variantspark-git/VariantSpark/examples/data/chr22_1000.vcf -ff /p/project/atmlaml/bazarova1/variantspark-git/VariantSpark/examples/data/chr22-labels.csv -fc 22_16050408 -v -rn 500 -rbs 20 -ro -sr 13
Exception in thread "main" java.lang.IncompatibleClassChangeError: Inconsistent constant pool data in classfile for class au/csiro/pbdava/ssparkle/spark/SparkApp. Method 'java.lang.String $anonfun$spark$1(au.csiro.pbdava.ssparkle.spark.SparkApp)' at index 118 is CONSTANT_MethodRef and should be CONSTANT_InterfaceMethodRef
    at au.csiro.pbdava.ssparkle.spark.SparkApp.spark(SparkApp.scala:22)
    at au.csiro.pbdava.ssparkle.spark.SparkApp.spark$(SparkApp.scala:21)
    at au.csiro.variantspark.cli.ImportanceCmd.spark$lzycompute(ImportanceCmd.scala:23)
    at au.csiro.variantspark.cli.ImportanceCmd.spark(ImportanceCmd.scala:23)
    at au.csiro.pbdava.ssparkle.spark.SparkApp.sc(SparkApp.scala:26)
    at au.csiro.pbdava.ssparkle.spark.SparkApp.sc$(SparkApp.scala:26)
    at au.csiro.variantspark.cli.ImportanceCmd.sc$lzycompute(ImportanceCmd.scala:23)
    at au.csiro.variantspark.cli.ImportanceCmd.sc(ImportanceCmd.scala:23)
    at au.csiro.variantspark.cli.ImportanceCmd.run(ImportanceCmd.scala:62)
    at au.csiro.sparkle.common.args4j.ArgsApp.run(ArgsApp.java:46)
    at au.csiro.sparkle.cmd.CmdApp.runApp(CmdApp.java:9)
    at au.csiro.sparkle.cmd.CmdApp.runApp(CmdApp.java:18)
    at au.csiro.sparkle.cmd.MultiCmdApp.runCommandOrClass(MultiCmdApp.java:58)
    at au.csiro.sparkle.cmd.MultiCmdApp.run(MultiCmdApp.java:54)
    at au.csiro.sparkle.cmd.CmdApp.runApp(CmdApp.java:9)
    at au.csiro.sparkle.cmd.CmdApp.runApp(CmdApp.java:18)
    at au.csiro.pbdava.ssparkle.common.arg4j.AppRunner$.mains(AppRunner.scala:17)
    at au.csiro.variantspark.cli.VariantSparkApp$.main(VariantSparkApp.scala:25)
    at au.csiro.variantspark.cli.VariantSparkApp.main(VariantSparkApp.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/06/30 15:39:58 INFO ShutdownHookManager: Shutdown hook called
23/06/30 15:39:58 INFO ShutdownHookManager: Deleting directory /tmp/spark-607d17a9-a237-4087-90d0-30815dc7e10d
```
rocreguant commented 1 year ago

Why do you say they are broken? Also, you may want to try the notebook option; its API is more up to date.