Apache Iceberg
https://iceberg.apache.org/
Apache License 2.0

Hive: Got NoSuchMethodException when create HiveMetaStoreClient in hive-metastore-2.1.1. #3363

Closed Reo-LEI closed 2 years ago

Reo-LEI commented 2 years ago

Since https://github.com/apache/iceberg/pull/3099, I encounter a NoSuchMethodException when I submit a Flink job to a Flink cluster whose Hive metastore version is 2.1.1.

I found that in #3099, HiveClientPool dynamically calls the RetryingMetaStoreClient.getProxy(HiveConf.class, Boolean.TYPE) method, and getProxy in turn looks up the HiveMetaStoreClient(HiveConf, Boolean) constructor. But HiveMetaStoreClient doesn't have this constructor in 2.1.1, so the NoSuchMethodException is raised.

I think we should find a way to stay compatible with this version. @szehon-ho @rdblue @pvary @jackye1995 @marton-bod
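The failure mechanism described above can be illustrated with plain reflection. The sketch below uses a hypothetical stand-in class (not the real HiveMetaStoreClient) that, like Hive 2.1.1's client, does not declare the constructor signature being requested; the constructor lookup then fails exactly as in the stack trace.

```java
import java.lang.reflect.Constructor;

// Hypothetical stand-in mirroring Hive 2.1.1's HiveMetaStoreClient, which
// declares a (HiveConf) constructor but no (HiveConf, Boolean) constructor.
class StandInMetaStoreClient {
    StandInMetaStoreClient(String conf) { /* connect to the metastore... */ }
}

public class ConstructorLookupDemo {
    // Mimics the lookup MetaStoreUtils.newInstance performs: it asks for an
    // exact constructor signature, and an absent one raises NoSuchMethodException.
    static boolean hasConstructor(Class<?> type, Class<?>... argTypes) {
        try {
            Constructor<?> ctor = type.getDeclaredConstructor(argTypes);
            return ctor != null;
        } catch (NoSuchMethodException e) {
            return false; // the failure surfaced in the stack trace above
        }
    }

    public static void main(String[] args) {
        // The (conf, Boolean) signature getProxy requests: absent in the stand-in.
        System.out.println(hasConstructor(
            StandInMetaStoreClient.class, String.class, Boolean.class)); // false
        // The signature the stand-in actually declares.
        System.out.println(hasConstructor(
            StandInMetaStoreClient.class, String.class)); // true
    }
}
```

Because the lookup asks for an exact parameter list, a constructor added or removed between Hive releases is enough to break a caller that hard-codes one signature.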

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Unable to create a sink for writing table 'default_catalog.default_database.table_name'.

Table options are:
...

    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:366) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:219) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:814) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1056) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1134) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112]
    at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_112]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754) [flink-shaded-hadoop-2-uber-2.7.5-7.0.jar:2.7.5-7.0]
    at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41) [flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1134) [flink-dist_2.12-1.12.1.jar:1.12.1]
Caused by: org.apache.flink.table.api.ValidationException: Unable to create a sink for writing table 'default_catalog.default_database.ods_mysql_hive_exec_job_rewrite'.

Table options are:

...

    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:156) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach$(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach(IterableLike.scala:70) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach$(IterableLike.scala:69) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map$(TraversableLike.scala:226) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156) ~[?:?]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    ... 11 more
Caused by: org.apache.iceberg.hive.RuntimeMetaException: Failed to connect to Hive Metastore
    at org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:73) ~[?:?]
    at org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:33) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.get(ClientPoolImpl.java:125) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:56) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51) ~[?:?]
    at org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:76) ~[?:?]
    at org.apache.iceberg.hive.HiveCatalog.loadNamespaceMetadata(HiveCatalog.java:386) ~[?:?]
    at org.apache.iceberg.flink.FlinkCatalog.getDatabase(FlinkCatalog.java:173) ~[?:?]
    at org.apache.iceberg.flink.FlinkCatalog.databaseExists(FlinkCatalog.java:185) ~[?:?]
    at org.apache.iceberg.flink.FlinkDynamicTableFactory.createTableLoader(FlinkDynamicTableFactory.java:163) ~[?:?]
    at org.apache.iceberg.flink.FlinkDynamicTableFactory.createDynamicTableSink(FlinkDynamicTableFactory.java:112) ~[?:?]
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:153) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach$(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach(IterableLike.scala:70) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach$(IterableLike.scala:69) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map$(TraversableLike.scala:226) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156) ~[?:?]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    ... 11 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:83) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.iceberg.common.DynMethods$UnboundMethod.invokeChecked(DynMethods.java:65) ~[?:?]
    at org.apache.iceberg.common.DynMethods$UnboundMethod.invoke(DynMethods.java:77) ~[?:?]
    at org.apache.iceberg.common.DynMethods$StaticMethod.invoke(DynMethods.java:196) ~[?:?]
    at org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:56) ~[?:?]
    at org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:33) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.get(ClientPoolImpl.java:125) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:56) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51) ~[?:?]
    at org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:76) ~[?:?]
    at org.apache.iceberg.hive.HiveCatalog.loadNamespaceMetadata(HiveCatalog.java:386) ~[?:?]
    at org.apache.iceberg.flink.FlinkCatalog.getDatabase(FlinkCatalog.java:173) ~[?:?]
    at org.apache.iceberg.flink.FlinkCatalog.databaseExists(FlinkCatalog.java:185) ~[?:?]
    at org.apache.iceberg.flink.FlinkDynamicTableFactory.createTableLoader(FlinkDynamicTableFactory.java:163) ~[?:?]
    at org.apache.iceberg.flink.FlinkDynamicTableFactory.createDynamicTableSink(FlinkDynamicTableFactory.java:112) ~[?:?]
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:153) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach$(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach(IterableLike.scala:70) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach$(IterableLike.scala:69) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map$(TraversableLike.scala:226) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156) ~[?:?]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    ... 11 more
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&lt;init&gt;(org.apache.hadoop.hive.conf.HiveConf, java.lang.Boolean)
    at java.lang.Class.getConstructor0(Class.java:3082) ~[?:1.8.0_112]
    at java.lang.Class.getDeclaredConstructor(Class.java:2178) ~[?:1.8.0_112]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1650) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&lt;init&gt;(RetryingMetaStoreClient.java:83) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[?:?]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.iceberg.common.DynMethods$UnboundMethod.invokeChecked(DynMethods.java:65) ~[?:?]
    at org.apache.iceberg.common.DynMethods$UnboundMethod.invoke(DynMethods.java:77) ~[?:?]
    at org.apache.iceberg.common.DynMethods$StaticMethod.invoke(DynMethods.java:196) ~[?:?]
    at org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:56) ~[?:?]
    at org.apache.iceberg.hive.HiveClientPool.newClient(HiveClientPool.java:33) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.get(ClientPoolImpl.java:125) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:56) ~[?:?]
    at org.apache.iceberg.ClientPoolImpl.run(ClientPoolImpl.java:51) ~[?:?]
    at org.apache.iceberg.hive.CachedClientPool.run(CachedClientPool.java:76) ~[?:?]
    at org.apache.iceberg.hive.HiveCatalog.loadNamespaceMetadata(HiveCatalog.java:386) ~[?:?]
    at org.apache.iceberg.flink.FlinkCatalog.getDatabase(FlinkCatalog.java:173) ~[?:?]
    at org.apache.iceberg.flink.FlinkCatalog.databaseExists(FlinkCatalog.java:185) ~[?:?]
    at org.apache.iceberg.flink.FlinkDynamicTableFactory.createTableLoader(FlinkDynamicTableFactory.java:163) ~[?:?]
    at org.apache.iceberg.flink.FlinkDynamicTableFactory.createDynamicTableSink(FlinkDynamicTableFactory.java:112) ~[?:?]
    at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:153) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:369) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:221) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.Iterator.foreach$(Iterator.scala:937) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach(IterableLike.scala:70) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.IterableLike.foreach$(IterableLike.scala:69) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map(TraversableLike.scala:233) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.TraversableLike.map$(TraversableLike.scala:226) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:159) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:676) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:767) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666) ~[flink-table_2.12-1.12.1.jar:1.12.1]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.executeSQL(FlinkSQLSubmitter.java:156) ~[?:?]
    at com.huya.dc.walrus.lakehouse.flink.sql.FlinkSQLSubmitter.main(FlinkSQLSubmitter.java:112) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
    ... 11 more
szehon-ho commented 2 years ago

@pvary is this a Hive problem in 2.1.1?

It seems we make a valid RetryingMetaStoreClient.getProxy call (ref: https://github.com/apache/hive/blob/rel/release-2.1.1/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java#L86), yet internally it calls a constructor on HiveMetaStoreClient that does not exist in this version? That's strange.

Maybe I can switch to calling: https://github.com/apache/hive/blob/rel/release-2.1.1/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java#L97

szehon-ho commented 2 years ago

It's unfortunate; it seems it's caused by a bad Hive release in 2.1.1, from this change: https://issues.apache.org/jira/browse/HIVE-12918.

It's fixed by https://issues.apache.org/jira/browse/HIVE-15081 on the Hive 2.3 branch, but that's too late for 2.1 users.

szehon-ho commented 2 years ago

Let me see if there's a common API that works across both versions.
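One generic way to get cross-version compatibility (a sketch of the fallback idea, not the actual Iceberg patch, which uses its internal DynMethods utility) is to try several candidate signatures in order and keep the first one the running Hive version actually declares. The demo in main() uses a JDK class so the sketch is self-contained; the missing `valueOf(Void)` signature is a hypothetical stand-in for a signature absent in one Hive release.

```java
import java.lang.reflect.Method;

// Resolve the first method signature that exists on the target class,
// in the spirit of trying a newer Hive signature and falling back to an older one.
public class SignatureFallback {
    static Method firstAvailable(Class<?> target, String name, Class<?>[]... candidates) {
        for (Class<?>[] argTypes : candidates) {
            try {
                return target.getMethod(name, argTypes);
            } catch (NoSuchMethodException e) {
                // Signature not declared in this version; try the next candidate.
            }
        }
        throw new IllegalStateException("no usable " + name + " method on " + target);
    }

    public static void main(String[] args) {
        // String declares valueOf(char[]) but not valueOf(Void):
        // the first candidate is skipped, the second is chosen.
        Method m = firstAvailable(String.class, "valueOf",
            new Class<?>[] {Void.class},     // absent -> skipped
            new Class<?>[] {char[].class});  // present -> chosen
        System.out.println(m);
    }
}
```

Resolving the signature once at pool-construction time (rather than per call) keeps the reflection cost off the hot path, which is roughly what a client-pool class would want.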

Reo-LEI commented 2 years ago

@szehon-ho Thanks for your fix! I will test the PR later. 😄

pvary commented 2 years ago

Thanks @Reo-LEI for reporting and @szehon-ho for fixing the issue!