Closed: amitkhanna1806 closed this issue 3 weeks ago
Backend
VL (Velox)
Bug description
[Expected behavior] The job should run successfully with the Gluten plugin; this is the sample SparkPi job, which completes without the plugin.
[Actual behavior] With the plugin enabled, the job crashes with the error pasted below before it even starts.
Spark version
Spark-3.4.x
Spark configurations
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[8] \
  --conf spark.memory.offHeap.size=2g \
  --conf spark.shuffle.manager=org.apache.spark.shuffle.sort.ColumnarShuffleManager \
  --conf spark.plugins=io.glutenproject.GlutenPlugin \
  examples/jars/spark-examples_2.12-3.4.1.jar 100

Gluten jar: gluten-velox-bundle-spark3.4_2.12-1.1.1.jar
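For comparison, the Gluten getting-started guide generally pairs `spark.memory.offHeap.size` with `spark.memory.offHeap.enabled=true`, which is absent from the failing command above. A hedged sketch of a submit command closer to the documented setup follows; the exact flags (and whether the bundle jar should go through `--jars` as shown here) should be double-checked against the Velox.md guide for the Gluten version in use:

```shell
# Sketch of the documented setup, not a verified fix for this crash.
# Assumption: the bundle jar is passed via --jars; the original report only
# says "Gluten jar: gluten-velox-bundle-spark3.4_2.12-1.1.1.jar".
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[8] \
  --conf spark.plugins=io.glutenproject.GlutenPlugin \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=2g \
  --conf spark.shuffle.manager=org.apache.spark.shuffle.sort.ColumnarShuffleManager \
  --jars gluten-velox-bundle-spark3.4_2.12-1.1.1.jar \
  examples/jars/spark-examples_2.12-3.4.1.jar 100
```

Note that the crash below happens inside libgluten.so during backend initialization, so a missing off-heap setting may not be the root cause; this sketch only aligns the command with the documented configuration.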
System information
Java version: 8
OS: Ubuntu 20.04.6 LTS (focal)
Kernel: Linux spark-on-kubernetes-spark-operator-776fcddb69-68hqt 5.15.0-1067-azure #76-Ubuntu SMP Wed Jun 12 18:19:38 UTC 2024 x86_64 GNU/Linux
Relevant logs
Stack: [0x00007fe2136be000,0x00007fe2137be000], sp=0x00007fe2137baf90, free space=1011k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C  [libgluten.so+0x317353]  gluten::Runtime::registerFactory(std::string const&, std::function<gluten::Runtime* (std::unordered_map<std::string, std::string, std::hash<std::string>, std::equal_to<std::string>, std::allocator<std::pair<std::string const, std::string> > > const&)>)+0x23
Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j  io.glutenproject.init.NativeBackendInitializer.initialize([B)V+0
j  io.glutenproject.init.NativeBackendInitializer.initializeBackend(Lscala/collection/Map;)V+28
j  io.glutenproject.backendsapi.velox.ListenerApiImpl$.initializeNative(Lscala/collection/immutable/Map;)V+9
j  io.glutenproject.backendsapi.velox.ListenerApiImpl.initialize(Lorg/apache/spark/SparkConf;)V+209
j  io.glutenproject.backendsapi.velox.ListenerApiImpl.onDriverStart(Lorg/apache/spark/SparkConf;)V+41
j  io.glutenproject.GlutenDriverPlugin.init(Lorg/apache/spark/SparkContext;Lorg/apache/spark/api/plugin/PluginContext;)Ljava/util/Map;+74
j  org.apache.spark.internal.plugin.DriverPluginContainer.$anonfun$driverPlugins$1(Lorg/apache/spark/internal/plugin/DriverPluginContainer;Lorg/apache/spark/api/plugin/SparkPlugin;)Lscala/collection/Iterable;+77
j  org.apache.spark.internal.plugin.DriverPluginContainer$$Lambda$574.apply(Ljava/lang/Object;)Ljava/lang/Object;+8
j  scala.collection.TraversableLike.$anonfun$flatMap$1(Lscala/collection/mutable/Builder;Lscala/Function1;Ljava/lang/Object;)Lscala/collection/mutable/Builder;+3
j  scala.collection.TraversableLike$$Lambda$207.apply(Ljava/lang/Object;)Ljava/lang/Object;+9
j  scala.collection.mutable.ResizableArray.foreach(Lscala/Function1;)V+23
j  scala.collection.mutable.ResizableArray.foreach$(Lscala/collection/mutable/ResizableArray;Lscala/Function1;)V+2
j  scala.collection.mutable.ArrayBuffer.foreach(Lscala/Function1;)V+2
j  scala.collection.TraversableLike.flatMap(Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;+14
j  scala.collection.TraversableLike.flatMap$(Lscala/collection/TraversableLike;Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;+3
j  scala.collection.AbstractTraversable.flatMap(Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;+3
j  org.apache.spark.internal.plugin.DriverPluginContainer.<init>(Lorg/apache/spark/SparkContext;Ljava/util/Map;Lscala/collection/Seq;)V+32
j  org.apache.spark.internal.plugin.PluginContainer$.apply(Lscala/util/Either;Ljava/util/Map;)Lscala/Option;+104
j  org.apache.spark.internal.plugin.PluginContainer$.apply(Lorg/apache/spark/SparkContext;Ljava/util/Map;)Lscala/Option;+10
j  org.apache.spark.SparkContext.<init>(Lorg/apache/spark/SparkConf;)V+1780
j  org.apache.spark.SparkContext$.getOrCreate(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;+23
j  org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;+30
j  org.apache.spark.sql.SparkSession$Builder$$Lambda$280.apply()Ljava/lang/Object;+4
j  scala.Option.getOrElse(Lscala/Function0;)Ljava/lang/Object;+8
j  org.apache.spark.sql.SparkSession$Builder.getOrCreate()Lorg/apache/spark/sql/SparkSession;+184
j  org.apache.spark.examples.SparkPi$.main([Ljava/lang/String;)V+11
j  org.apache.spark.examples.SparkPi.main([Ljava/lang/String;)V+4
v  ~StubRoutines::call_stub
j  sun.reflect.NativeMethodAccessorImpl.invoke0(Ljava/lang/reflect/Method;Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+0
j  sun.reflect.NativeMethodAccessorImpl.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+100
j  sun.reflect.DelegatingMethodAccessorImpl.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+6
j  java.lang.reflect.Method.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+56
j  org.apache.spark.deploy.JavaMainApplication.start([Ljava/lang/String;Lorg/apache/spark/SparkConf;)V+97
j  org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(Lorg/apache/spark/deploy/SparkSubmitArguments;Z)V+561
j  org.apache.spark.deploy.SparkSubmit.doRunMain$1(Lorg/apache/spark/deploy/SparkSubmitArguments;Z)V+192
j  org.apache.spark.deploy.SparkSubmit.submit(Lorg/apache/spark/deploy/SparkSubmitArguments;Z)V+65
j  org.apache.spark.deploy.SparkSubmit.doSubmit([Ljava/lang/String;)V+78
j  org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit([Ljava/lang/String;)V+2
j  org.apache.spark.deploy.SparkSubmit$.main([Ljava/lang/String;)V+29
j  org.apache.spark.deploy.SparkSubmit.main([Ljava/lang/String;)V+4
v  ~StubRoutines::call_stub
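Since this is a crash in native code (libgluten.so), the JVM's full fatal-error log usually carries more context than the frames above (loaded libraries, environment, CPU info). A hedged sketch for writing that log to a predictable path so it can be attached to the issue; -XX:ErrorFile is a standard HotSpot flag and %p expands to the crashing process's PID:

```shell
# Write the JVM fatal-error (hs_err) log to a known location on the driver.
# The rest of the command mirrors the failing one from this report.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[8] \
  --conf "spark.driver.extraJavaOptions=-XX:ErrorFile=/tmp/hs_err_%p.log" \
  --conf spark.memory.offHeap.size=2g \
  --conf spark.shuffle.manager=org.apache.spark.shuffle.sort.ColumnarShuffleManager \
  --conf spark.plugins=io.glutenproject.GlutenPlugin \
  examples/jars/spark-examples_2.12-3.4.1.jar 100
```

After the crash, /tmp/hs_err_<pid>.log should contain the complete register dump and library list.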
Please follow this document, thanks! https://github.com/apache/incubator-gluten/blob/main/docs/get-started/Velox.md