```
java.lang.NoSuchMethodError: org.apache.hadoop.tracing.SpanReceiverHost.getInstance(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/tracing/SpanReceiverHost;
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:641)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:810)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:794)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1487)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1115)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:986)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:815)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:475)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:434)
```
I'm trying to write unit tests that verify Spark and HDFS interactions. I am using SharedJavaSparkContext from the spark-testing-base library.
Below is the @BeforeClass setup:
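(The original snippet did not survive the copy. The sketch below shows the usual shape of a MiniDFSCluster bootstrap in a @BeforeClass method, reconstructed from the stack trace; the class and variable names are illustrative, not the original code.)

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.junit.AfterClass;
import org.junit.BeforeClass;

import com.holdenkarau.spark.testing.SharedJavaSparkContext;

public class SparkHdfsInteractionTest extends SharedJavaSparkContext {

    // In-process HDFS cluster shared by all tests in the class
    private static MiniDFSCluster cluster;

    @BeforeClass
    public static void setUpHdfs() throws Exception {
        Configuration conf = new Configuration();
        // build() boots a NameNode; the NoSuchMethodError in the question
        // is thrown from NameNode.initialize during this call
        cluster = new MiniDFSCluster.Builder(conf).build();
        cluster.waitActive();
    }

    @AfterClass
    public static void tearDownHdfs() {
        if (cluster != null) {
            cluster.shutdown();
        }
    }
}
```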
The POM level dependency is as follows:
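(The dependency block is also missing from the excerpt. For MiniDFSCluster one typically needs the hadoop-hdfs tests jar in addition to hadoop-common and spark-testing-base; the fragment below is a sketch of that shape, with placeholder versions rather than the ones from the original POM.)

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
    <scope>test</scope>
</dependency>
<!-- MiniDFSCluster lives in the hadoop-hdfs test-jar -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
    <classifier>tests</classifier>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.holdenkarau</groupId>
    <artifactId>spark-testing-base_2.11</artifactId>
    <version>${spark.testing.base.version}</version>
    <scope>test</scope>
</dependency>
```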
I'm getting a failure at the MiniDFSCluster.Builder.build() call (the snippet did not survive the copy, but it is the top frame in the stack trace).
Error stack: see the trace pasted at the top of this post.
Investigation so far:
I ran mvn dependency:tree and confirmed that hadoop-hdfs resolves to 2.6.x or newer, including for the transitive dependencies.
I also checked ~/.m2/repository and cleaned up the org/apache/hadoop directory to make sure no stale artifacts were present.
Any help or pointers would be appreciated. I will post an update if I manage to resolve it in the meantime.