apache / hudi

Upserts, Deletes And Incremental Processing on Big Data.
https://hudi.apache.org/
Apache License 2.0

[SUPPORT] Integration test broken after upgrading from 0.10.0 to 0.10.1 #4793

Closed. pan3793 closed this issue 2 years ago.

pan3793 commented 2 years ago

Describe the problem you faced

The Apache Kyuubi (Incubating) Hudi integration test broke after upgrading from 0.10.0 to 0.10.1.

https://github.com/apache/incubator-kyuubi/runs/5152924363

To Reproduce

The TEST CASE

https://github.com/apache/incubator-kyuubi/pull/1897

Expected behavior

The test case passes, just as it did with Hudi 0.10.0.

Environment Description

Additional context

Stacktrace

- get tables *** FAILED ***
  java.sql.SQLException: Error operating EXECUTE_STATEMENT: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.CatalogTable.copy(Lorg/apache/spark/sql/catalyst/TableIdentifier;Lorg/apache/spark/sql/catalyst/catalog/CatalogTableType;Lorg/apache/spark/sql/catalyst/catalog/CatalogStorageFormat;Lorg/apache/spark/sql/types/StructType;Lscala/Option;Lscala/collection/Seq;Lscala/Option;Ljava/lang/String;JJLjava/lang/String;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/collection/Seq;ZZLscala/collection/immutable/Map;)Lorg/apache/spark/sql/catalyst/catalog/CatalogTable;
    at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand$.createTableInCatalog(CreateHoodieTableCommand.scala:136)
    at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.run(CreateHoodieTableCommand.scala:71)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
    at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3687)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
    at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:79)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at org.apache.kyuubi.engine.spark.operation.SparkOperation.withLocalProperties(SparkOperation.scala:88)
    at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.org$apache$kyuubi$engine$spark$operation$ExecuteStatement$$executeStatement(ExecuteStatement.scala:73)
    at org.apache.kyuubi.engine.spark.operation.ExecuteStatement$$anon$1.run(ExecuteStatement.scala:105)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
  at org.apache.kyuubi.jdbc.hive.KyuubiStatement.waitForOperationToComplete(KyuubiStatement.java:405)
  at org.apache.kyuubi.jdbc.hive.KyuubiStatement.executeWithConfOverlay(KyuubiStatement.java:255)
  at org.apache.kyuubi.jdbc.hive.KyuubiStatement.execute(KyuubiStatement.java:249)
  at org.apache.kyuubi.operation.HudiMetadataTests.$anonfun$$init$$10(HudiMetadataTests.scala:74)
  at org.apache.kyuubi.operation.HudiMetadataTests.$anonfun$$init$$10$adapted(HudiMetadataTests.scala:66)
  at org.apache.kyuubi.operation.JDBCTestHelper.$anonfun$withMultipleConnectionJdbcStatement$3(JDBCTestHelper.scala:60)
  at org.apache.kyuubi.operation.JDBCTestHelper.$anonfun$withMultipleConnectionJdbcStatement$3$adapted(JDBCTestHelper.scala:60)
  at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
  at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
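
The NoSuchMethodError above indicates that the Hudi Spark module was compiled against a CatalogTable.copy signature from a different Spark version than the one on the runtime classpath. As a quick check, here is a hedged sketch for inspecting the signature a given Spark jar actually ships (the jar path is illustrative):

```
# Print the copy(...) signature compiled into a local spark-catalyst jar;
# the path below is illustrative, adjust it to your environment.
javap -cp spark-catalyst_2.12-3.1.2.jar \
  org.apache.spark.sql.catalyst.catalog.CatalogTable | grep ' copy('
```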
nsivabalan commented 2 years ago

May I know which Hudi bundle or artifact you are using? With 0.10.1, for Spark 3, the bundle names have changed: hudi-spark3.0.3-bundle and hudi-spark3.1.2-bundle.

In the previous release it was hudi-spark3-bundle*.
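
For illustration, pulling the renamed 0.10.1 bundles would look roughly like this (coordinates follow the naming above; verify the exact artifact IDs on Maven Central):

```
# Spark 3.1.x (illustrative; check the exact coordinates on Maven Central):
spark-shell --packages org.apache.hudi:hudi-spark3.1.2-bundle_2.12:0.10.1
# Spark 3.0.x:
spark-shell --packages org.apache.hudi:hudi-spark3.0.3-bundle_2.12:0.10.1
```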

pan3793 commented 2 years ago

@nsivabalan thanks for your reply.

May I know which Hudi bundle or artifact you are using?

We use the vanilla jars instead of the bundle jar, for the reasons below.

I think Hudi has room to improve the bundle jar to reduce the dependency maintenance effort for users and downstream projects. Compared to other data lake formats: Delta restricts itself from pulling in dependencies other than Spark, so delta-core has only one transitive dependency, jackson-core-asl, which is not included in the Spark runtime jars. Iceberg provides a runtime jar, which is similar to the Hudi bundle jars, but differs in the following ways:

  1. The Iceberg runtime jar does not contain classes that already exist in the Spark runtime libraries, e.g. curator.
  2. The Iceberg runtime jar relocates nearly every class outside the org.apache.iceberg package to avoid potential class conflicts with user classes (see the quick check after this list).
  3. Iceberg provides runtime jars for each supported Spark minor version, e.g. iceberg-spark-runtime-0.13.0.jar for Spark 2.4.x, iceberg-spark3-runtime-0.13.0.jar for Spark 3.0.x, iceberg-spark-runtime-3.1_2.12-0.13.0.jar for Spark 3.1.x, and iceberg-spark-runtime-3.2_2.12-0.13.0.jar for Spark 3.2.x.
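
As a hedged way to observe point 2 locally (file names are illustrative; Iceberg 0.13 places relocated classes under org.apache.iceberg.shaded):

```
# Count relocated third-party classes inside the Iceberg runtime jar:
unzip -l iceberg-spark3-runtime-0.13.0.jar | grep -c 'org/apache/iceberg/shaded'
# For comparison, list packages shipped un-relocated in a Hudi bundle:
unzip -l hudi-spark3.1.2-bundle_2.12-0.10.1.jar | grep -v 'org/apache/hudi' | head
```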
nsivabalan commented 2 years ago

@xushiyan: Can you follow up here, please? These look like good suggestions that can be taken into consideration. I will let you drive this; let me know if you need to jam.

nsivabalan commented 2 years ago

@pan3793 : are you folks still blocked on this?

pan3793 commented 2 years ago

No progress yet.

xushiyan commented 2 years ago

@pan3793 thanks for the feedback. To look into this issue, can you post which vanilla jars you used? Since this is a Spark version conflict, that will help us analyze.

Looping in @XuQianJin-Stars here. Maybe we can start driving an epic for this topic, improving dependency bundling, based on the feedback above.

xushiyan commented 2 years ago

@pan3793 did you build the vanilla jars with the Spark 3 profile? There appears to be a Spark version mismatch.

- get tables *** FAILED ***
  java.sql.SQLException: Error operating EXECUTE_STATEMENT: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.CatalogTable.copy(Lorg/apache/spark/sql/catalyst/TableIdentifier;Lorg/apache/spark/sql/catalyst/catalog/CatalogTableType;Lorg/apache/spark/sql/catalyst/catalog/CatalogStorageFormat;Lorg/apache/spark/sql/types/StructType;Lscala/Option;Lscala/collection/Seq;Lscala/Option;Ljava/lang/String;JJLjava/lang/String;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/collection/Seq;ZZLscala/collection/immutable/Map;)Lorg/apache/spark/sql/catalyst/catalog/CatalogTable;
    at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand$.createTableInCatalog(CreateHoodieTableCommand.scala:136)

@YannByron do you happen to know what might be mismatched here?
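
For reference, building the vanilla jars against Spark 3 from source looks roughly like the sketch below; the profile flags are an assumption based on the Hudi 0.10.x build docs and should be verified against the README of the exact release tag:

```
# ASSUMPTION: profile names as documented for Hudi 0.10.x; verify in the README.
mvn clean package -DskipTests -Dspark3          # Spark 3.1.x, Scala 2.12
# mvn clean package -DskipTests -Dspark3.0.x    # Spark 3.0.x
```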

pan3793 commented 2 years ago

@xushiyan thanks for helping, and sorry I didn't notice your first reply.

can you post which vanilla jars you used?

Basically, we use the following jars related to Hudi; you can check more details in our project source code: https://github.com/apache/incubator-kyuubi

pom.xml

```
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>${hadoop.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.parquet</groupId>
  <artifactId>parquet-avro</artifactId>
  <version>${parquet.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-avro_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.hudi</groupId>
  <artifactId>hudi-spark-common_${scala.binary.version}</artifactId>
  <version>${hudi.version}</version>
  <exclusions>
    <exclusion><groupId>org.scala-lang</groupId><artifactId>scala-library</artifactId></exclusion>
    <exclusion><groupId>org.apache.hudi</groupId><artifactId>hudi-timeline-service</artifactId></exclusion>
    <exclusion><groupId>io.dropwizard.metrics</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>io.prometheus</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>log4j</groupId><artifactId>log4j</artifactId></exclusion>
    <exclusion><groupId>org.apache.curator</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>org.apache.hadoop</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>org.apache.hbase</groupId><artifactId>hbase-server</artifactId></exclusion>
    <exclusion><groupId>org.apache.orc</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>org.apache.hudi</groupId><artifactId>hudi-aws</artifactId></exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hudi</groupId>
  <artifactId>hudi-spark_${scala.binary.version}</artifactId>
  <version>${hudi.version}</version>
  <exclusions>
    <exclusion><groupId>org.scala-lang</groupId><artifactId>scala-library</artifactId></exclusion>
    <exclusion><groupId>org.apache.hudi</groupId><artifactId>hudi-spark-common_2.11</artifactId></exclusion>
    <exclusion><groupId>org.apache.hudi</groupId><artifactId>hudi-spark2_2.11</artifactId></exclusion>
    <exclusion><groupId>org.apache.curator</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>com.fasterxml.jackson.core</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>com.fasterxml.jackson.module</groupId><artifactId>*</artifactId></exclusion>
    <exclusion><groupId>log4j</groupId><artifactId>log4j</artifactId></exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hudi</groupId>
  <artifactId>hudi-spark3_${scala.binary.version}</artifactId>
  <version>${hudi.version}</version>
  <exclusions>
    <exclusion><groupId>org.apache.hudi</groupId><artifactId>hudi-spark-common_2.11</artifactId></exclusion>
  </exclusions>
</dependency>
```

did you build the vanilla jars with the Spark 3 profile?

No, we use the jars officially published by Hudi. They work fine with Hudi 0.10.0, but not with Hudi 0.10.1.

YannByron commented 2 years ago
s"""
           | create table $table (
           |  id int,
           |  name string,
           |  price double,
           |  ts long
           | ) using $format
           | options (
           |   primaryKey = 'id',
           |   preCombineField = 'ts',
           |   hoodie.bootstrap.index.class =
           |   'org.apache.hudi.common.bootstrap.index.NoOpBootstrapIndex'
           | )
       """.stripMargin

I see it's Hudi 0.10.1 and Spark 3.1.2 in https://github.com/apache/incubator-kyuubi/pull/1897/files. My test passes in the same environment, but I use the bundle jar. In addition to the vanilla jars, what other Hudi jars did you put in the Spark env?

And is there the same problem if you use spark-sql instead of Kyuubi? @pan3793
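
A hedged spark-sql invocation for such a cross-check, mirroring the session config used later in this thread (the bundle jar path is hypothetical):

```
# Run spark-sql with the Hudi bundle; the two --conf values match the session
# config in the repro below, the jar path is hypothetical:
spark-sql --jars /path/to/hudi-spark3.1.2-bundle_2.12-0.10.1.jar \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension
```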

pan3793 commented 2 years ago

@YannByron

what other Hudi jars did you put in the Spark env

Dependency Tree

```
(apache-kyuubi) ➜  apache-kyuubi git:(hudi) mvn -pl :kyuubi-spark-sql-engine_2.12 dependency:tree
[INFO] Scanning for projects...
[INFO]
[INFO] -----------< org.apache.kyuubi:kyuubi-spark-sql-engine_2.12 >-----------
[INFO] Building Kyuubi Project Engine Spark SQL 1.5.0-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-dependency-plugin:3.1.1:tree (default-cli) @ kyuubi-spark-sql-engine_2.12 ---
[INFO] org.apache.kyuubi:kyuubi-spark-sql-engine_2.12:jar:1.5.0-SNAPSHOT
[INFO] +- org.apache.kyuubi:kyuubi-common_2.12:jar:1.5.0-SNAPSHOT:compile
[INFO] |  +- org.scala-lang:scala-library:jar:2.12.15:compile
[INFO] |  +- org.slf4j:slf4j-api:jar:1.7.35:compile
[INFO] |  +- org.slf4j:jcl-over-slf4j:jar:1.7.35:compile
[INFO] |  +- org.apache.logging.log4j:log4j-slf4j-impl:jar:2.17.1:compile
[INFO] |  +- org.apache.logging.log4j:log4j-api:jar:2.17.1:compile
[INFO] |  +- org.apache.logging.log4j:log4j-core:jar:2.17.1:compile
[INFO] |  +- org.apache.logging.log4j:log4j-1.2-api:jar:2.17.1:compile
[INFO] |  +- commons-codec:commons-codec:jar:1.15:compile
[INFO] |  +- org.apache.commons:commons-lang3:jar:3.10:compile
[INFO] |  +- com.google.guava:guava:jar:30.1-jre:compile
[INFO] |  |  \- com.google.guava:failureaccess:jar:1.0.1:compile
[INFO] |  +- org.apache.thrift:libfb303:jar:0.9.3:compile
[INFO] |  +- org.apache.thrift:libthrift:jar:0.16.0:compile
[INFO] |  |  \- org.apache.httpcomponents:httpcore:jar:4.4.15:compile
[INFO] |  +- org.apache.hive:hive-service-rpc:jar:2.3.9:compile
[INFO] |  +- jakarta.xml.bind:jakarta.xml.bind-api:jar:2.3.2:compile
[INFO] |  |  \- jakarta.activation:jakarta.activation-api:jar:1.2.1:compile
[INFO] |  +- com.fasterxml.jackson.module:jackson-module-scala_2.12:jar:2.13.1:compile
[INFO] |  |  \- com.thoughtworks.paranamer:paranamer:jar:2.8:compile
[INFO] |  \- com.fasterxml.jackson.core:jackson-databind:jar:2.13.1:compile
[INFO] +- org.apache.kyuubi:kyuubi-ha_2.12:jar:1.5.0-SNAPSHOT:compile
[INFO] |  +- org.apache.curator:curator-framework:jar:2.12.0:compile
[INFO] |  |  \- org.apache.curator:curator-client:jar:2.12.0:compile
[INFO] |  +- org.apache.curator:curator-recipes:jar:2.12.0:compile
[INFO] |  \- org.apache.zookeeper:zookeeper:jar:3.4.14:compile
[INFO] +- org.apache.spark:spark-sql_2.12:jar:3.1.2:provided
[INFO] |  +- com.univocity:univocity-parsers:jar:2.9.1:provided
[INFO] |  +- org.apache.spark:spark-sketch_2.12:jar:3.1.2:provided
[INFO] |  +- org.apache.spark:spark-core_2.12:jar:3.1.2:provided
[INFO] |  |  +- com.twitter:chill_2.12:jar:0.9.5:provided
[INFO] |  |  +- com.twitter:chill-java:jar:0.9.5:provided
[INFO] |  |  +- org.apache.spark:spark-launcher_2.12:jar:3.1.2:provided
[INFO] |  |  +- org.apache.spark:spark-kvstore_2.12:jar:3.1.2:provided
[INFO] |  |  |  \- org.fusesource.leveldbjni:leveldbjni-all:jar:1.8:provided
[INFO] |  |  +- org.apache.spark:spark-network-common_2.12:jar:3.1.2:provided
[INFO] |  |  +- org.apache.spark:spark-network-shuffle_2.12:jar:3.1.2:provided
[INFO] |  |  +- org.apache.spark:spark-unsafe_2.12:jar:3.1.2:provided
[INFO] |  |  +- javax.activation:activation:jar:1.1.1:provided
[INFO] |  |  +- jakarta.servlet:jakarta.servlet-api:jar:4.0.4:provided
[INFO] |  |  +- org.apache.commons:commons-math3:jar:3.4.1:provided
[INFO] |  |  +- org.apache.commons:commons-text:jar:1.6:provided
[INFO] |  |  +- com.ning:compress-lzf:jar:1.0.3:provided
[INFO] |  |  +- org.xerial.snappy:snappy-java:jar:1.1.8.2:provided
[INFO] |  |  +- org.lz4:lz4-java:jar:1.7.1:provided
[INFO] |  |  +- com.github.luben:zstd-jni:jar:1.4.8-1:provided
[INFO] |  |  +- org.roaringbitmap:RoaringBitmap:jar:0.9.0:provided
[INFO] |  |  |  \- org.roaringbitmap:shims:jar:0.9.0:provided
[INFO] |  |  +- commons-net:commons-net:jar:3.1:provided
[INFO] |  |  +- org.json4s:json4s-jackson_2.12:jar:3.7.0-M5:provided
[INFO] |  |  |  \- org.json4s:json4s-core_2.12:jar:3.7.0-M5:provided
[INFO] |  |  |     +- org.json4s:json4s-ast_2.12:jar:3.7.0-M5:provided
[INFO] |  |  |     \- org.json4s:json4s-scalap_2.12:jar:3.7.0-M5:provided
[INFO] |  |  +- org.glassfish.jersey.core:jersey-client:jar:2.30:provided
[INFO] |  |  |  +- jakarta.ws.rs:jakarta.ws.rs-api:jar:2.1.6:provided
[INFO] |  |  |  \- org.glassfish.hk2.external:jakarta.inject:jar:2.6.1:provided
[INFO] |  |  +- org.glassfish.jersey.core:jersey-common:jar:2.34:provided
[INFO] |  |  |  +- jakarta.annotation:jakarta.annotation-api:jar:1.3.5:provided
[INFO] |  |  |  \- org.glassfish.hk2:osgi-resource-locator:jar:1.0.3:provided
[INFO] |  |  +- org.glassfish.jersey.core:jersey-server:jar:2.34:provided
[INFO] |  |  |  \- jakarta.validation:jakarta.validation-api:jar:2.0.2:provided
[INFO] |  |  +- org.glassfish.jersey.containers:jersey-container-servlet:jar:2.30:provided
[INFO] |  |  +- org.glassfish.jersey.containers:jersey-container-servlet-core:jar:2.34:provided
[INFO] |  |  +- org.glassfish.jersey.inject:jersey-hk2:jar:2.34:provided
[INFO] |  |  |  +- org.glassfish.hk2:hk2-locator:jar:2.6.1:provided
[INFO] |  |  |  |  +- org.glassfish.hk2.external:aopalliance-repackaged:jar:2.6.1:provided
[INFO] |  |  |  |  +- org.glassfish.hk2:hk2-api:jar:2.6.1:provided
[INFO] |  |  |  |  \- org.glassfish.hk2:hk2-utils:jar:2.6.1:provided
[INFO] |  |  |  \- org.javassist:javassist:jar:3.25.0-GA:provided
[INFO] |  |  +- io.netty:netty-all:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-buffer:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-codec:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-common:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-handler:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-tcnative-classes:jar:2.0.46.Final:provided
[INFO] |  |  |  +- io.netty:netty-resolver:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport-classes-epoll:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport-native-unix-common:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport-classes-kqueue:jar:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport-native-epoll:jar:linux-x86_64:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport-native-epoll:jar:linux-aarch_64:4.1.73.Final:provided
[INFO] |  |  |  +- io.netty:netty-transport-native-kqueue:jar:osx-x86_64:4.1.73.Final:provided
[INFO] |  |  |  \- io.netty:netty-transport-native-kqueue:jar:osx-aarch_64:4.1.73.Final:provided
[INFO] |  |  +- com.clearspring.analytics:stream:jar:2.9.6:provided
[INFO] |  |  +- io.dropwizard.metrics:metrics-core:jar:4.2.8:provided
[INFO] |  |  +- io.dropwizard.metrics:metrics-jvm:jar:4.2.8:provided
[INFO] |  |  +- io.dropwizard.metrics:metrics-json:jar:4.2.8:provided
[INFO] |  |  +- io.dropwizard.metrics:metrics-graphite:jar:4.1.1:provided
[INFO] |  |  +- io.dropwizard.metrics:metrics-jmx:jar:4.2.8:provided
[INFO] |  |  +- org.apache.ivy:ivy:jar:2.4.0:provided
[INFO] |  |  +- oro:oro:jar:2.0.8:provided
[INFO] |  |  +- net.razorvine:pyrolite:jar:4.30:provided
[INFO] |  |  +- net.sf.py4j:py4j:jar:0.10.9:provided
[INFO] |  |  \- org.apache.commons:commons-crypto:jar:1.1.0:provided
[INFO] |  +- org.apache.spark:spark-catalyst_2.12:jar:3.1.2:provided
[INFO] |  |  +- org.scala-lang.modules:scala-parser-combinators_2.12:jar:1.1.2:provided
[INFO] |  |  +- org.codehaus.janino:janino:jar:3.0.16:provided
[INFO] |  |  +- org.codehaus.janino:commons-compiler:jar:3.0.16:provided
[INFO] |  |  +- org.antlr:antlr4-runtime:jar:4.8-1:provided
[INFO] |  |  +- org.apache.arrow:arrow-vector:jar:2.0.0:provided
[INFO] |  |  |  +- org.apache.arrow:arrow-format:jar:2.0.0:provided
[INFO] |  |  |  +- org.apache.arrow:arrow-memory-core:jar:2.0.0:provided
[INFO] |  |  |  \- com.google.flatbuffers:flatbuffers-java:jar:1.9.0:provided
[INFO] |  |  \- org.apache.arrow:arrow-memory-netty:jar:2.0.0:provided
[INFO] |  +- org.apache.spark:spark-tags_2.12:jar:3.1.2:provided
[INFO] |  +- org.apache.orc:orc-core:jar:1.5.12:provided
[INFO] |  |  +- org.apache.orc:orc-shims:jar:1.5.12:provided
[INFO] |  |  +- com.google.protobuf:protobuf-java:jar:2.5.0:provided
[INFO] |  |  +- commons-lang:commons-lang:jar:2.6:provided
[INFO] |  |  +- io.airlift:aircompressor:jar:0.10:provided
[INFO] |  |  \- org.threeten:threeten-extra:jar:1.5.0:provided
[INFO] |  +- org.apache.orc:orc-mapreduce:jar:1.5.12:provided
[INFO] |  +- org.apache.hive:hive-storage-api:jar:2.7.2:provided
[INFO] |  +- org.apache.parquet:parquet-column:jar:1.10.1:provided
[INFO] |  |  +- org.apache.parquet:parquet-common:jar:1.10.1:provided
[INFO] |  |  \- org.apache.parquet:parquet-encoding:jar:1.10.1:provided
[INFO] |  +- org.apache.parquet:parquet-hadoop:jar:1.10.1:provided
[INFO] |  |  +- org.apache.parquet:parquet-jackson:jar:1.10.1:provided
[INFO] |  |  \- org.codehaus.jackson:jackson-core-asl:jar:1.9.13:provided
[INFO] |  +- org.apache.xbean:xbean-asm7-shaded:jar:4.15:provided
[INFO] |  \- org.spark-project.spark:unused:jar:1.0.0:provided
[INFO] +- org.apache.spark:spark-repl_2.12:jar:3.1.2:provided
[INFO] |  \- org.apache.spark:spark-mllib_2.12:jar:3.1.2:provided
[INFO] |     +- org.apache.spark:spark-streaming_2.12:jar:3.1.2:provided
[INFO] |     +- org.apache.spark:spark-graphx_2.12:jar:3.1.2:provided
[INFO] |     |  +- com.github.fommil.netlib:core:jar:1.1.2:provided
[INFO] |     |  \- net.sourceforge.f2j:arpack_combined_all:jar:0.1:provided
[INFO] |     +- org.apache.spark:spark-mllib-local_2.12:jar:3.1.2:provided
[INFO] |     +- org.scalanlp:breeze_2.12:jar:1.0:provided
[INFO] |     |  +- org.scalanlp:breeze-macros_2.12:jar:1.0:provided
[INFO] |     |  +- com.github.wendykierp:JTransforms:jar:3.1:provided
[INFO] |     |  |  \- pl.edu.icm:JLargeArrays:jar:1.5:provided
[INFO] |     |  +- com.chuusai:shapeless_2.12:jar:2.3.3:provided
[INFO] |     |  |  \- org.typelevel:macro-compat_2.12:jar:1.1.1:provided
[INFO] |     |  +- org.typelevel:spire_2.12:jar:0.17.0-M1:provided
[INFO] |     |  |  +- org.typelevel:spire-macros_2.12:jar:0.17.0-M1:provided
[INFO] |     |  |  +- org.typelevel:spire-platform_2.12:jar:0.17.0-M1:provided
[INFO] |     |  |  +- org.typelevel:spire-util_2.12:jar:0.17.0-M1:provided
[INFO] |     |  |  +- org.typelevel:machinist_2.12:jar:0.6.8:provided
[INFO] |     |  |  \- org.typelevel:algebra_2.12:jar:2.0.0-M2:provided
[INFO] |     |  |     \- org.typelevel:cats-kernel_2.12:jar:2.0.0-M4:provided
[INFO] |     |  \- org.scala-lang.modules:scala-collection-compat_2.12:jar:2.1.1:provided
[INFO] |     \- org.glassfish.jaxb:jaxb-runtime:jar:2.3.2:provided
[INFO] |        \- com.sun.istack:istack-commons-runtime:jar:3.0.8:provided
[INFO] +- org.scala-lang:scala-compiler:jar:2.12.15:provided
[INFO] |  \- org.scala-lang.modules:scala-xml_2.12:jar:1.0.6:provided
[INFO] +- org.scala-lang:scala-reflect:jar:2.12.15:provided
[INFO] +- org.apache.hadoop:hadoop-client-api:jar:3.3.1:provided
[INFO] +- org.apache.kyuubi:kyuubi-common_2.12:test-jar:tests:1.5.0-SNAPSHOT:test
[INFO] +- commons-collections:commons-collections:jar:3.2.2:test
[INFO] +- commons-io:commons-io:jar:2.8.0:test
[INFO] +- org.apache.spark:spark-hive_2.12:jar:3.1.2:test
[INFO] |  +- org.apache.hive:hive-common:jar:2.3.7:test
[INFO] |  |  +- commons-cli:commons-cli:jar:1.2:test
[INFO] |  |  +- jline:jline:jar:0.9.94:test
[INFO] |  |  +- org.apache.commons:commons-compress:jar:1.9:provided
[INFO] |  |  +- com.tdunning:json:jar:1.8:test
[INFO] |  |  \- com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter:jar:0.1.2:test
[INFO] |  +- org.apache.hive:hive-exec:jar:core:2.3.7:test
[INFO] |  |  +- org.apache.hive:hive-vector-code-gen:jar:2.3.7:test
[INFO] |  |  |  \- org.apache.velocity:velocity:jar:1.5:test
[INFO] |  |  +- org.antlr:antlr-runtime:jar:3.5.2:test
[INFO] |  |  +- org.antlr:ST4:jar:4.0.4:test
[INFO] |  |  +- com.google.code.gson:gson:jar:2.2.4:test
[INFO] |  |  \- stax:stax-api:jar:1.0.1:test
[INFO] |  +- org.apache.hive:hive-metastore:jar:2.3.7:test
[INFO] |  |  +- javolution:javolution:jar:5.5.1:test
[INFO] |  |  +- com.jolbox:bonecp:jar:0.8.0.RELEASE:test
[INFO] |  |  +- com.zaxxer:HikariCP:jar:2.5.1:test
[INFO] |  |  +- org.datanucleus:datanucleus-api-jdo:jar:4.2.4:test
[INFO] |  |  +- org.datanucleus:datanucleus-rdbms:jar:4.1.19:test
[INFO] |  |  +- commons-pool:commons-pool:jar:1.5.4:test
[INFO] |  |  +- commons-dbcp:commons-dbcp:jar:1.4:test
[INFO] |  |  +- javax.jdo:jdo-api:jar:3.0.1:test
[INFO] |  |  |  \- javax.transaction:jta:jar:1.1:test
[INFO] |  |  \- org.datanucleus:javax.jdo:jar:3.2.0-m3:test
[INFO] |  |     \- javax.transaction:transaction-api:jar:1.1:test
[INFO] |  +- org.apache.hive:hive-serde:jar:2.3.7:test
[INFO] |  |  \- net.sf.opencsv:opencsv:jar:2.3:provided
[INFO] |  +- org.apache.hive:hive-shims:jar:2.3.7:test
[INFO] |  |  +- org.apache.hive.shims:hive-shims-common:jar:2.3.7:test
[INFO] |  |  +- org.apache.hive.shims:hive-shims-0.23:jar:2.3.7:test
[INFO] |  |  \- org.apache.hive.shims:hive-shims-scheduler:jar:2.3.7:test
[INFO] |  +- org.apache.hive:hive-llap-common:jar:2.3.7:test
[INFO] |  +- org.apache.hive:hive-llap-client:jar:2.3.7:test
[INFO] |  +- org.apache.avro:avro:jar:1.8.2:provided
[INFO] |  |  \- org.tukaani:xz:jar:1.5:provided
[INFO] |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.8.2:provided
[INFO] |  |  \- org.apache.avro:avro-ipc:jar:1.8.2:provided
[INFO] |  +- commons-httpclient:commons-httpclient:jar:3.1:test
[INFO] |  |  \- commons-logging:commons-logging:jar:1.0.4:compile
[INFO] |  +- org.apache.httpcomponents:httpclient:jar:4.5.6:compile
[INFO] |  +- org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:provided
[INFO] |  +- joda-time:joda-time:jar:2.10.5:test
[INFO] |  +- org.jodd:jodd-core:jar:3.5.2:test
[INFO] |  +- com.google.code.findbugs:jsr305:jar:3.0.2:compile
[INFO] |  +- org.datanucleus:datanucleus-core:jar:4.1.17:test
[INFO] |  \- org.apache.derby:derby:jar:10.12.1.1:test
[INFO] +- org.apache.kyuubi:kyuubi-hive-jdbc-shaded:jar:1.5.0-SNAPSHOT:test
[INFO] +- org.apache.hadoop:hadoop-client-runtime:jar:3.3.1:test
[INFO] |  \- org.apache.htrace:htrace-core4:jar:4.1.0-incubating:test
[INFO] +- org.slf4j:jul-to-slf4j:jar:1.7.35:test
[INFO] +- org.apache.iceberg:iceberg-spark3-runtime:jar:0.13.0:test
[INFO] +- org.apache.spark:spark-avro_2.12:jar:3.1.2:test
[INFO] +- org.apache.hudi:hudi-spark-common_2.12:jar:0.10.1:test
[INFO] |  +- org.apache.hudi:hudi-client-common:jar:0.10.1:test
[INFO] |  |  \- com.github.davidmoten:hilbert-curve:jar:0.2.2:test
[INFO] |  |     \- com.github.davidmoten:guava-mini:jar:0.1.3:test
[INFO] |  +- org.apache.hudi:hudi-spark-client:jar:0.10.1:test
[INFO] |  +- org.apache.hudi:hudi-common:jar:0.10.1:test
[INFO] |  |  +- org.apache.httpcomponents:fluent-hc:jar:4.4.1:test
[INFO] |  |  +- org.rocksdb:rocksdbjni:jar:5.17.2:test
[INFO] |  |  \- com.esotericsoftware:kryo-shaded:jar:4.0.2:provided
[INFO] |  |     +- com.esotericsoftware:minlog:jar:1.3.0:provided
[INFO] |  |     \- org.objenesis:objenesis:jar:2.5.1:provided
[INFO] |  \- org.apache.hudi:hudi-hive-sync:jar:0.10.1:test
[INFO] |     \- com.beust:jcommander:jar:1.72:test
[INFO] +- org.apache.hudi:hudi-spark3_2.12:jar:0.10.1:test
[INFO] |  +- com.fasterxml.jackson.core:jackson-annotations:jar:2.13.1:compile
[INFO] |  \- com.fasterxml.jackson.core:jackson-core:jar:2.13.1:compile
[INFO] +- org.apache.parquet:parquet-avro:jar:1.10.1:test
[INFO] |  +- org.apache.parquet:parquet-format:jar:2.4.0:provided
[INFO] |  \- it.unimi.dsi:fastutil:jar:7.0.13:test
[INFO] +- org.apache.hudi:hudi-spark_2.12:jar:0.10.1:test
[INFO] |  +- org.apache.hudi:hudi-hadoop-mr:jar:0.10.1:test
[INFO] |  \- org.apache.hudi:hudi-sync-common:jar:0.10.1:test
[INFO] +- io.delta:delta-core_2.12:jar:1.0.1:test
[INFO] +- org.apache.kyuubi:kyuubi-zookeeper_2.12:jar:1.5.0-SNAPSHOT:test
[INFO] \- org.scalatest:scalatest_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-core_2.12:jar:3.2.9:test
[INFO]    |  +- org.scalatest:scalatest-compatible:jar:3.2.9:test
[INFO]    |  \- org.scalactic:scalactic_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-featurespec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-flatspec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-freespec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-funsuite_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-funspec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-propspec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-refspec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-wordspec_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-diagrams_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-matchers-core_2.12:jar:3.2.9:test
[INFO]    +- org.scalatest:scalatest-shouldmatchers_2.12:jar:3.2.9:test
[INFO]    \- org.scalatest:scalatest-mustmatchers_2.12:jar:3.2.9:test
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  1.702 s
[INFO] Finished at: 2022-03-01T11:24:46+08:00
[INFO] ------------------------------------------------------------------------
```
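
To narrow a tree like this to the artifacts relevant here, maven-dependency-plugin supports an includes filter; a small sketch:

```
# Show only Hudi artifacts (and the paths leading to them) in the tree:
mvn -pl :kyuubi-spark-sql-engine_2.12 dependency:tree -Dincludes=org.apache.hudi
```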

is there the same problem if you use spark-sql instead of Kyuubi

Basically, the kyuubi-spark-sql-engine component is just a Spark application that starts a thrift server in the Spark driver process and exposes an interface compatible with the HiveServer2 thrift protocol. In this case, it merely forwards the user's SQL to spark.sql(xxx), so I don't think the result will differ from running the SQL via spark.sql(xxx) directly, but let me try.

pan3793 commented 2 years ago

@YannByron FYI, after switching to spark.sql(xxx), the SQL failed with the same error.

test("hudi 0.10.1") {
    val spark = SparkSession.builder()
      .config("spark.sql.catalogImplementation", "in-memory")
      .config("spark.sql.defaultCatalog", "spark_catalog")
      .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .getOrCreate()

    spark.sql(
      s"""
         | create table hudi_tbl (
         |  id int,
         |  name string,
         |  price double,
         |  ts long
         | ) using hudi
         | options (
         |   primaryKey = 'id',
         |   preCombineField = 'ts',
         |   hoodie.bootstrap.index.class =
         |   'org.apache.hudi.common.bootstrap.index.NoOpBootstrapIndex'
         | )
       """.stripMargin)
  }
Stacktrace

```
An exception or error caused a run to abort: org.apache.spark.sql.catalyst.catalog.CatalogTable.copy(Lorg/apache/spark/sql/catalyst/TableIdentifier;Lorg/apache/spark/sql/catalyst/catalog/CatalogTableType;Lorg/apache/spark/sql/catalyst/catalog/CatalogStorageFormat;Lorg/apache/spark/sql/types/StructType;Lscala/Option;Lscala/collection/Seq;Lscala/Option;Ljava/lang/String;JJLjava/lang/String;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/collection/Seq;ZZLscala/collection/immutable/Map;)Lorg/apache/spark/sql/catalyst/catalog/CatalogTable;
java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.CatalogTable.copy(Lorg/apache/spark/sql/catalyst/TableIdentifier;Lorg/apache/spark/sql/catalyst/catalog/CatalogTableType;Lorg/apache/spark/sql/catalyst/catalog/CatalogStorageFormat;Lorg/apache/spark/sql/types/StructType;Lscala/Option;Lscala/collection/Seq;Lscala/Option;Ljava/lang/String;JJLjava/lang/String;Lscala/collection/immutable/Map;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/collection/Seq;ZZLscala/collection/immutable/Map;)Lorg/apache/spark/sql/catalyst/catalog/CatalogTable;
  at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand$.createTableInCatalog(CreateHoodieTableCommand.scala:136)
  at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.run(CreateHoodieTableCommand.scala:71)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
  at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
  at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3687)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
  at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
  at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
  at org.apache.kyuubi.engine.spark.operation.SparkHudiOperationSuite.$anonfun$new$1(SparkHudiOperationSuite.scala:56)
  at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  at org.scalatest.Transformer.apply(Transformer.scala:20)
  at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
  at org.apache.kyuubi.KyuubiFunSuite.withFixture(KyuubiFunSuite.scala:63)
  at org.apache.kyuubi.KyuubiFunSuite.withFixture$(KyuubiFunSuite.scala:57)
  at org.apache.kyuubi.engine.spark.operation.SparkHudiOperationSuite.withFixture(SparkHudiOperationSuite.scala:27)
  at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
  at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
  at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
  at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
  at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
  at org.apache.kyuubi.engine.spark.operation.SparkHudiOperationSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkHudiOperationSuite.scala:27)
  at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
  at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
  at org.apache.kyuubi.engine.spark.operation.SparkHudiOperationSuite.runTest(SparkHudiOperationSuite.scala:27)
  at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
  at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
  at scala.collection.immutable.List.foreach(List.scala:431)
  at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
  at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
  at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
  at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
  at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
  at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
  at org.scalatest.Suite.run(Suite.scala:1112)
  at org.scalatest.Suite.run$(Suite.scala:1094)
  at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
  at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
  at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
  at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
  at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
  at org.apache.kyuubi.engine.spark.operation.SparkHudiOperationSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkHudiOperationSuite.scala:27)
  at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
  at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
  at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
  at org.apache.kyuubi.engine.spark.operation.SparkHudiOperationSuite.run(SparkHudiOperationSuite.scala:27)
  at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
  at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
  at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
  at scala.collection.immutable.List.foreach(List.scala:431)
  at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
  at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
  at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
  at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
  at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
  at org.scalatest.tools.Runner$.run(Runner.scala:798)
  at org.scalatest.tools.Runner.run(Runner.scala)
  at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:38)
  at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:25)
```
codope commented 2 years ago

Looks like the test got fixed after upgrading to 0.11.0: https://github.com/apache/incubator-kyuubi/commit/cb5f49e3e9bf4afed100e756302b69879faf5e61. Please reopen if you still see the issue.