rapidsai / spark-examples

[ARCHIVED] Moved to github.com/NVIDIA/spark-xgboost-examples
https://github.com/NVIDIA/spark-xgboost-examples
Apache License 2.0

Standalone GPU demo failed. #39

Closed. xiaonans closed this issue 5 years ago.

xiaonans commented 5 years ago

Hi guys,

I can successfully run the Standalone CPU demo, but the GPU demo fails with the following error:

Exception in thread "main" java.lang.NoSuchMethodException: org.apache.spark.sql.execution.datasources.FilePartition$.apply(int, scala.collection.Seq, scala.collection.Seq) at java.lang.Class.getMethod(Class.java:1786) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset$.ml$dmlc$xgboost4j$scala$spark$rapids$GpuDataset$$createFilePartition(GpuDataset.scala:764) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.ml$dmlc$xgboost4j$scala$spark$rapids$GpuDataset$$closePartition$1(GpuDataset.scala:276) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.ml$dmlc$xgboost4j$scala$spark$rapids$GpuDataset$$getFilePartitions(GpuDataset.scala:293) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset$$anonfun$partitions$1.apply(GpuDataset.scala:259) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset$$anonfun$partitions$1.apply(GpuDataset.scala:249) at scala.Option.getOrElse(Option.scala:121) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.partitions$lzycompute(GpuDataset.scala:249) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.partitions(GpuDataset.scala:249) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.buildRDD(GpuDataset.scala:99) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.mapColumnarSingleBatchPerPartition(GpuDataset.scala:108) at ml.dmlc.xgboost4j.scala.spark.rapids.GpuDataset.findNumClasses(GpuDataset.scala:146) at ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier.getNumberClasses(XGBoostClassifier.scala:228) at ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier.fit(XGBoostClassifier.scala:247) at ai.rapids.spark.examples.mortgage.GPUMain$$anonfun$6.apply(GPUMain.scala:69) at ai.rapids.spark.examples.mortgage.GPUMain$$anonfun$6.apply(GPUMain.scala:69) at ai.rapids.spark.examples.utility.Benchmark$.time(Benchmark.scala:21) at ai.rapids.spark.examples.mortgage.GPUMain$.main(GPUMain.scala:68) at ai.rapids.spark.examples.mortgage.GPUMain.main(GPUMain.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845) at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161) at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184) at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86) at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 19/09/15 15:47:31 INFO SparkContext: Invoking stop() from shutdown hook 19/09/15 15:47:31 INFO SparkUI: Stopped Spark web UI at http://10.19.206.100:4040 19/09/15 15:47:31 INFO StandaloneSchedulerBackend: Shutting down all executors 19/09/15 15:47:31 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down 19/09/15 15:47:31 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 19/09/15 15:47:31 INFO MemoryStore: MemoryStore cleared 19/09/15 15:47:31 INFO BlockManager: BlockManager stopped 19/09/15 15:47:31 INFO BlockManagerMaster: BlockManagerMaster stopped 19/09/15 15:47:31 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
19/09/15 15:47:31 INFO SparkContext: Successfully stopped SparkContext 19/09/15 15:47:31 INFO ShutdownHookManager: Shutdown hook called 19/09/15 15:47:31 INFO ShutdownHookManager: Deleting directory /tmp/spark-f3afac0f-fb32-4040-9b6a-4712efa1cd59 19/09/15 15:47:31 INFO ShutdownHookManager: Deleting directory /tmp/spark-3442df88-ff24-4712-8b26-1bd4d4e83021
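The call that fails here is a reflective Class.getMethod lookup of the companion apply method of Spark's internal FilePartition class, so the exception is thrown as soon as that method's parameter list differs from what the xgboost4j-spark jar expects. A minimal sketch of that failure mode is below; DemoPartition is a hypothetical stand-in, not Spark's real FilePartition, and the signatures are only illustrative:

```scala
// Sketch only: DemoPartition is a made-up stand-in for Spark's internal
// FilePartition; the point is how a signature mismatch surfaces as
// java.lang.NoSuchMethodException from Class.getMethod.
object ReflectionDemo {
  case class DemoPartition(index: Int, files: Seq[String])

  def main(args: Array[String]): Unit = {
    val clazz = DemoPartition.getClass // companion object class (DemoPartition$)

    // A lookup matching the generated apply(int, Seq) succeeds.
    println("found: " + clazz.getMethod("apply", classOf[Int], classOf[Seq[_]]))

    // A lookup expecting an extra Seq parameter (the shape asked for in the
    // stack trace above) fails with java.lang.NoSuchMethodException.
    try {
      clazz.getMethod("apply", classOf[Int], classOf[Seq[_]], classOf[Seq[_]])
    } catch {
      case e: NoSuchMethodException => println("missing: " + e)
    }
  }
}
```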

My mvn -version output is

Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 21:51:28+0800)
Maven home: /home/xiaonans/softwares/apache-maven-3.0.5
Java version: 1.8.0_222, vendor: Private Build
Java home: /usr/lib/jvm/java-8-openjdk-amd64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "4.15.0-55-generic", arch: "amd64", family: "unix"

chuanlihao commented 5 years ago

Hi Xiaonan, what Spark version were you using when running the examples? I don't think Spark 2.4.4 is supported yet; maybe you could try version 2.4.3.
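To confirm which Spark version the standalone cluster is actually running (as opposed to the version the example jars were built against), you can print it from a spark-shell started from the same installation. A minimal check, assuming the pre-defined `spark` session in spark-shell:

```scala
// Run inside spark-shell on the standalone cluster's Spark installation.
// Both lines report the runtime Spark version the GPU example would see.
println(spark.version)                   // e.g. "2.4.3" or "2.4.4"
println(org.apache.spark.SPARK_VERSION)  // same information via the library constant
```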

xiaonans commented 5 years ago

Thanks @chuanlihao, I changed the Spark version from 2.4.4 to 2.4.3, and that solved my problem.