Describe the bug
This is almost identical to #189. Each Delta release jar supports only one specific Spark version. When an incompatible Delta jar is used with Spark, the following error is thrown:
java.lang.IncompatibleClassChangeError: class org.apache.spark.sql.catalyst.plans.logical.DeltaDelete has interface org.apache.spark.sql.catalyst.plans.logical.UnaryNode as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.delta.DeltaAnalysis.apply(DeltaAnalysis.scala:64)
at org.apache.spark.sql.delta.DeltaAnalysis.apply(DeltaAnalysis.scala:57)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:211)
Steps/Code to reproduce bug
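A minimal sketch of a setup that could trigger the error above, assuming Spark 3.2.x on the classpath together with a delta-core jar built for an earlier Spark line (the versions, paths, and query below are illustrative assumptions, not taken from the original report):

// Scala sketch: run against Spark 3.2.x with an older delta-core (e.g. 1.0.x) on the classpath.
// Any query that causes the DeltaAnalysis rule to run can trigger the class-load failure,
// since the rule references the DeltaDelete plan node shown in the stack trace.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("delta-version-mismatch-repro")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()

// Write a small Delta table, then run a DELETE against it; analyzing the DELETE
// exercises DeltaAnalysis and loads DeltaDelete, producing the
// IncompatibleClassChangeError when the jar versions do not match.
spark.range(10).write.format("delta").save("/tmp/delta-mismatch-table")
spark.sql("DELETE FROM delta.`/tmp/delta-mismatch-table` WHERE id > 5")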
Expected behavior
No error.
Environment details (please complete the following information)
Solution
Use Delta jar version 1.1.0 for Spark 3.2.x. Add more info in the README to inform users.
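A sketch of how the matching dependency pinning could look in an sbt build, assuming Scala 2.12 and Spark 3.2.x (the exact patch versions are assumptions; the coordinates are the standard spark-sql and delta-core artifacts):

// build.sbt sketch: pair Spark 3.2.x with delta-core 1.1.0, which targets Spark 3.2
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "3.2.1" % "provided",
  "io.delta"         %% "delta-core" % "1.1.0"
)

The same pairing applies when launching with --packages io.delta:delta-core_2.12:1.1.0 on a Spark 3.2.x cluster.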