holdenk / spark-testing-base

Base classes to use when writing tests with Spark
Apache License 2.0
1.51k stars · 359 forks

java.lang.IncompatibleClassChangeError #163

Open aminaaslam opened 7 years ago

aminaaslam commented 7 years ago

I am getting this error while running a very simple unit test case, using spark-testing-base_2.10 version "2.0.0_0.4.7".

This problem occurs on this line:

// Run the assertions on the result and expected
JavaRDDComparisons.assertRDDEquals(
    JavaRDD.fromRDD(JavaPairRDD.toRDD(result), tag),
    JavaRDD.fromRDD(JavaPairRDD.toRDD(expectedRDD), tag));

I am aware of issue ScalaTest 3.0.0 Support #137, but I am not using ScalaTest.

java.lang.IncompatibleClassChangeError
    at org.apache.spark.internal.Logging$class.$init$(Logging.scala:35)
    at com.holdenkarau.spark.testing.Utils$.<init>(Utils.scala:33)
    at com.holdenkarau.spark.testing.Utils$.<clinit>(Utils.scala)
    at com.holdenkarau.spark.testing.JavaRDDComparisons$.compareRDD(JavaRDDComparisons.scala:41)
    at com.holdenkarau.spark.testing.JavaRDDComparisons$.assertRDDEquals(JavaRDDComparisons.scala:29)
    at com.holdenkarau.spark.testing.JavaRDDComparisons.assertRDDEquals(JavaRDDComparisons.scala)
    at com.baesystems.ai.analytics.util.RDDUtilTest.verifyHistogramTest(RDDUtilTest.java:52)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:678)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)

holdenk commented 7 years ago

Which version of Spark & Scala & Log4J are you using?

aminaaslam commented 7 years ago

I am using Spark 2.0.2, scala.version "2.11", and slf4j-log4j12 1.7.16.

aminaaslam commented 7 years ago

Any help in this regard would be appreciated. Can someone explain why I am getting this error?

holdenk commented 7 years ago

You have mismatched versions of Spark and spark-testing-base. If the logging classes (or log4j versions) changed at all between 2.0.0 and 2.0.2, that would explain the exception. Can you try this with matching versions?
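As a sketch of what a matching pair looks like in sbt: the part of the spark-testing-base version before the underscore must equal your Spark version, and the part after it is the library's own release number (0.7.2 below is illustrative; check Maven Central for a release actually published against your Spark version):

```scala
// Keep Spark and spark-testing-base in lockstep via one version value.
val sparkVersion = "2.0.2"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  // spark-testing-base artifacts are versioned <sparkVersion>_<libraryVersion>
  "com.holdenkarau" %% "spark-testing-base" % s"${sparkVersion}_0.7.2" % "test"
)
```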

GouravGit commented 7 years ago

I am getting the same kind of error but am not able to figure out the actual cause. In my case the versions match, though (I think).

1.8 1.8 UTF-8 2.11 2.11.8 2.0.0 1.4.0 0.4.1

<dependency>
    <groupId>com.holdenkarau</groupId>
    <artifactId>spark-testing-base_2.11</artifactId>
    <version>2.0.0_0.7.1</version>
    <scope>test</scope>
</dependency>

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/gdutta2/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/gdutta2/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/09/11 03:00:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

java.lang.IncompatibleClassChangeError: Class com.holdenkarau.spark.testing.Utils$ does not implement the requested interface org.apache.spark.internal.Logging
    at org.apache.spark.internal.Logging$class.$init$(Logging.scala:35)
    at com.holdenkarau.spark.testing.Utils$.<init>(Utils.scala:33)
    at com.holdenkarau.spark.testing.Utils$.<clinit>(Utils.scala)
    at com.holdenkarau.spark.testing.SparkContextProvider$class.setup(SparkContextProvider.scala:45)
    at com.metlife.adr.lrm.test.LrmTest.setup(LrmTest.scala:24)
    at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:38)
    at com.metlife.adr.lrm.test.LrmTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(LrmTest.scala:24)
    at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:41)
    at com.metlife.adr.lrm.test.LrmTest.beforeAll(LrmTest.scala:39)
    at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
    at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    at com.metlife.adr.lrm.test.LrmTest.run(LrmTest.scala:24)
    at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:678)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)

wuzhim commented 4 years ago

I ran into the same problem. I put spark-testing-base first in my dependencies. I do not know why this solves the problem, but it worked for me. Hope it helps somebody.
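For anyone who wants to try the same workaround, here is a hedged sbt sketch of what "first in my dependencies" might look like (versions are illustrative, and classpath ordering effects like this depend on the build tool):

```scala
libraryDependencies ++= Seq(
  // spark-testing-base listed before the Spark artifacts, as described above,
  // so its jars come earlier on the resolved test classpath
  "com.holdenkarau" %% "spark-testing-base" % "2.0.0_0.7.1" % "test",
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
)
```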

sbly commented 3 years ago

I am still getting this with the latest version. All my versions match, so I think it must be something else. These are the relevant library dependencies:

      libraryDependencies += "com.holdenkarau" %% "spark-testing-base" % "3.1.2_1.1.0" % "test",
      libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2" % "provided",
      libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.1.2" % "provided",
      libraryDependencies += "org.scalactic" %% "scalactic" % "3.2.9",
      libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.9" % "test",
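Since the declared versions above already match, one remaining suspect is a transitive dependency dragging in a second Spark or SLF4J artifact. A hedged diagnostic sketch for sbt 1.4+ (this is a suggestion, not a confirmed fix for this report; the slf4j-log4j12 version below is illustrative):

```scala
// Print the resolved test classpath tree and look for a second
// org.apache.spark or org.slf4j artifact at a different version:
//   sbt "Test/dependencyTree"
//
// If a clashing version turns up, pinning it explicitly is one way to
// force a single copy onto the classpath:
dependencyOverrides += "org.slf4j" % "slf4j-log4j12" % "1.7.30"
```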