chenzhaohangbj opened 5 years ago
Hi, do you use external shuffle service?
My Spark conf:

spark.driver.extraClassPath     /home/bigdata/local/spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar
spark.executor.extraClassPath   /home/bigdata/local/spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar
spark.shuffle.manager           org.apache.spark.shuffle.rdma.RdmaShuffleManager
spark.shuffle.compress          false
spark.shuffle.spill.compress    false
spark.broadcast.compress        false
spark.broadcast.checksum        false
spark.locality.wait             0
So the release tar contains prebuilt jars for Spark versions from 2.0 to 2.4.
Are you trying to use it with a different Spark version?
Yes, and it does not work.
@chenzhaohangbj which spark version and which SparkRDMA jar do you use?
Spark 2.1.0, 2.1.1, and 2.3.0
So you need to use:
spark 2.1.0 - spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar
spark 2.1.1 - spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar
spark 2.3.0 - spark-rdma-3.1-for-spark-2.3.0-jar-with-dependencies.jar
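The mapping above can be sketched as a small shell helper (the function name and structure are illustrative, not part of SparkRDMA; only the versions mentioned in this thread are covered):

```shell
# Pick the SparkRDMA jar that matches the local Spark version.
pick_rdma_jar() {
  case "$1" in
    # 2.1.0 and 2.1.1 share the same prebuilt jar
    2.1.0|2.1.1) echo "spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar" ;;
    2.3.0)       echo "spark-rdma-3.1-for-spark-2.3.0-jar-with-dependencies.jar" ;;
    *)           echo "unsupported" ;;
  esac
}

pick_rdma_jar 2.1.1   # prints spark-rdma-3.1-for-spark-2.1.0-jar-with-dependencies.jar
```

The chosen jar would then go into both `spark.driver.extraClassPath` and `spark.executor.extraClassPath`, as in the conf at the top of this thread.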
Which jar does the NodeManager need?
The NodeManager doesn't need any jar. We don't support the external shuffle service yet.
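Since the external shuffle service is unsupported, a minimal spark-defaults.conf for SparkRDMA would keep it explicitly disabled (a sketch based on the settings quoted above; `spark.shuffle.service.enabled` defaults to `false` anyway):

```
spark.shuffle.manager          org.apache.spark.shuffle.rdma.RdmaShuffleManager
spark.shuffle.service.enabled  false
```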
Are the SparkRDMA shuffle and the default Spark shuffle compatible on the NodeManager?
If you don't use the external YARN shuffle service, then the NodeManager is used only to launch the Spark application. Spark itself will instantiate the configured shuffle manager. SparkRDMA is fully compatible with the default Spark shuffle.
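The switch between the two shuffle implementations is just the `spark.shuffle.manager` setting; both values below are read by the driver and executors, not by the NodeManager (a sketch, `sort` being Spark's built-in default):

```
# Default Spark shuffle (equivalent to leaving the setting unset):
spark.shuffle.manager  sort

# SparkRDMA shuffle (as in the conf at the top of this thread):
spark.shuffle.manager  org.apache.spark.shuffle.rdma.RdmaShuffleManager
```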
I notice it is hardcoded in https://github.com/apache/spark/blob/master/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockResolver.java:150 that the Spark external shuffle service only supports two ShuffleManager types. So if I edit that code and recompile Spark, can SparkRDMA work with the external shuffle service enabled?
When I run Spark on YARN (Spark 2.1.0, Hadoop 2.7.3), the NodeManager pulls in spark-2.1.0-yarn-shuffle.jar, but when the Spark version is not 2.1.0, the container cannot launch.