ytchen0323 / cloud-scale-bwamem

Apache License 2.0

Command Failure #21

Open Rokshan2016 opened 7 years ago

Rokshan2016 commented 7 years ago

Hi, I ran this command in cloud-scale-bwamem:

```
SPARK_DRIVER_MEMORY=6g /opt/cloudera/parcels/CDH-5.10.1-1.cdh5.10.1.p0.10/bin/spark-submit \
  --executor-memory 6g \
  --class cs.ucla.edu.bwaspark.BWAMEMSpark \
  /home/rokshan.jahan/project/spark-genome-alignment-demo/build/cloud-scale-bwamem/target/cloud-scale-bwamem-0.2.2-assembly.jar \
  cs-bwamem -bfn 1 -bPSW 1 -sbatch 10 -bPSWJNI 1 -oChoice 2 \
  -oPath hdfs://ip-10-48-3-5.ips.local:8020/user/rokshan.jahan/data/bwamem.adam \
  -localRef 1 -R "@RGID:HCC1954LB:HCC1954SM:HCC1954" \
  -isSWExtBatched 1 -bSWExtSize 32768 -FPGAAccSWExt 0 -FPGASWExtThreshold 64 \
  -jniSWExtendLibPath /home/rokshan.jahan/project/spark-genome-alignment-demo/build/cloud-scale-bwamem/target/jniSWExtend.so \
  1 \
  hdfs://ip-10-48-3-5.ips.local:8020/user/rokshan.jahan/data/Homo_sapiens_assembly18.fasta \
  hdfs://ip-10-48-3-5.ips.local:8020/user/rokshan.jahan/data/SRR1517848.fastq
```

I am getting this error:

```
Error on reading header
Load Index Files
Exception in thread "main" java.lang.AssertionError: assertion failed
	at scala.Predef$.assert(Predef.scala:165)
	at cs.ucla.edu.bwaspark.datatype.BinaryFileReadUtil$.readByteArray(BinaryFileReadUtil.scala:237)
	at cs.ucla.edu.bwaspark.datatype.BWAIdxType.pacLoader$1(BWAIdxType.scala:80)
	at cs.ucla.edu.bwaspark.datatype.BWAIdxType.load(BWAIdxType.scala:89)
	at cs.ucla.edu.bwaspark.FastMap$.memMain(FastMap.scala:119)
	at cs.ucla.edu.bwaspark.BWAMEMSpark$.main(BWAMEMSpark.scala:302)
	at cs.ucla.edu.bwaspark.BWAMEMSpark.main(BWAMEMSpark.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/09/18 19:26:01 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/09/18 19:26:01 INFO ui.SparkUI: Stopped Spark web UI at http://10.48.3.63:4040
17/09/18 19:26:01 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
17/09/18 19:26:01 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
17/09/18 19:26:01 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
17/09/18 19:26:01 INFO cluster.YarnClientSchedulerBackend: Stopped
17/09/18 19:26:01 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/09/18 19:26:01 INFO storage.MemoryStore: MemoryStore cleared
17/09/18 19:26:01 INFO storage.BlockManager: BlockManager stopped
17/09/18 19:26:01 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
17/09/18 19:26:01 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/09/18 19:26:01 INFO spark.SparkContext: Successfully stopped SparkContext
17/09/18 19:26:01 INFO util.ShutdownHookManager: Shutdown hook called
17/09/18 19:26:01 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/09/18 19:26:01 INFO util.ShutdownHookManager: Deleting directory /data1/tmp/spark-8c981d74-6eca-42b2-b3ef-b8a1c6612ce4
```
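From the trace, the assertion fires in `readByteArray` while `pacLoader` is loading the `.pac` index from HDFS, which points at a short read: end-of-file was reached before the expected number of bytes, as happens when an index file is missing, truncated, or built against a different FASTA. The following Python snippet is only a sketch of that assumed failure mode (the function name and exact check are inferred from the stack trace, not taken from the project's code):

```python
import io

def read_byte_array(stream, length):
    """Sketch of the assumed readByteArray contract: read exactly
    `length` bytes from the index file, or trip the assertion."""
    data = stream.read(length)
    # Mirrors the `assertion failed` seen in the trace on a short read.
    assert len(data) == length, "assertion failed"
    return data

# A complete read succeeds...
ok = read_byte_array(io.BytesIO(b"\x00" * 8), 8)

# ...but a truncated (or missing) .pac-style file trips the same AssertionError.
try:
    read_byte_array(io.BytesIO(b"\x00" * 4), 8)
except AssertionError:
    print("short read: index file truncated or missing?")
```

If this reading of the trace is right, a first step would be to confirm that all BWA index files for `Homo_sapiens_assembly18.fasta` are present and complete in the HDFS data directory.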

Any suggestion would be helpful.

Thanks