trinodb / trino

Official repository of Trino, the distributed SQL query engine for big data, formerly known as PrestoSQL (https://trino.io)
Apache License 2.0

`Table being modified concurrently` happens in Glue tests #13199

Closed (ebyhr closed this issue 1 year ago)

ebyhr commented 2 years ago
tests               | 2022-07-16 00:57:47 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testCommentOnColumn (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 5.2 seconds
tests               | 2022-07-16 00:57:47 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: Query failed (#20220715_191246_00202_n5fs3): Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: acd0498f-ab5e-4587-8a4f-1dca96620063; Proxy: null)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:119)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
tests               |   at io.trino.tests.product.utils.QueryExecutors$1.lambda$executeQuery$0(QueryExecutors.java:60)
tests               |   at net.jodah.failsafe.Functions.lambda$get$0(Functions.java:48)
tests               |   at net.jodah.failsafe.RetryPolicyExecutor.lambda$supply$0(RetryPolicyExecutor.java:62)
tests               |   at net.jodah.failsafe.Execution.executeSync(Execution.java:129)
tests               |   at net.jodah.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
tests               |   at net.jodah.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:67)
tests               |   at io.trino.tests.product.utils.QueryExecutors$1.executeQuery(QueryExecutors.java:60)
tests               |   at io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testCommentOnColumn(TestDeltaLakeAlterTableCompatibility.java:144)
tests               |   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
tests               |   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
tests               |   at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
tests               |   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
tests               |   at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
tests               |   at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
tests               |   at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
tests               |   at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
tests               |   at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
tests               |   at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
tests               |   at java.base/java.lang.Thread.run(Thread.java:833)
tests               | Caused by: java.sql.SQLException: Query failed (#20220715_191246_00202_n5fs3): Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: acd0498f-ab5e-4587-8a4f-1dca96620063; Proxy: null)
tests               |   at io.trino.jdbc.AbstractTrinoResultSet.resultsException(AbstractTrinoResultSet.java:1937)
tests               |   at io.trino.jdbc.TrinoResultSet.getColumns(TrinoResultSet.java:285)
tests               |   at io.trino.jdbc.TrinoResultSet.create(TrinoResultSet.java:61)
tests               |   at io.trino.jdbc.TrinoStatement.internalExecute(TrinoStatement.java:262)
tests               |   at io.trino.jdbc.TrinoStatement.execute(TrinoStatement.java:240)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:128)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
tests               |   ... 22 more
tests               |   Suppressed: java.lang.Exception: Query: DROP TABLE delta.default.test_dl_comment_column_1ni3sfrfyqeu
tests               |       at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:136)
tests               |       ... 23 more
tests               | Caused by: io.trino.spi.TrinoException: Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: acd0498f-ab5e-4587-8a4f-1dca96620063; Proxy: null)
tests               |   at io.trino.plugin.hive.metastore.glue.GlueHiveMetastore.dropTable(GlueHiveMetastore.java:583)
tests               |   at io.trino.plugin.hive.metastore.cache.CachingHiveMetastore.dropTable(CachingHiveMetastore.java:512)
tests               |   at io.trino.plugin.deltalake.metastore.HiveMetastoreBackedDeltaLakeMetastore.dropTable(HiveMetastoreBackedDeltaLakeMetastore.java:167)
tests               |   at io.trino.plugin.deltalake.DeltaLakeMetadata.dropTable(DeltaLakeMetadata.java:1765)
tests               |   at io.trino.plugin.base.classloader.ClassLoaderSafeConnectorMetadata.dropTable(ClassLoaderSafeConnectorMetadata.java:390)
tests               |   at io.trino.metadata.MetadataManager.dropTable(MetadataManager.java:739)
tests               |   at io.trino.execution.DropTableTask.execute(DropTableTask.java:89)
tests               |   at io.trino.execution.DropTableTask.execute(DropTableTask.java:37)
tests               |   at io.trino.execution.DataDefinitionExecution.start(DataDefinitionExecution.java:145)
tests               |   at io.trino.execution.SqlQueryManager.createQuery(SqlQueryManager.java:249)
tests               |   at io.trino.dispatcher.LocalDispatchQuery.lambda$startExecution$7(LocalDispatchQuery.java:143)
tests               |   at io.trino.$gen.Trino_390_13_gb3d38c3____20220715_185147_2.run(Unknown Source)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
tests               |   at java.base/java.lang.Thread.run(Thread.java:833)
tests               | Caused by: com.amazonaws.services.glue.model.ConcurrentModificationException: Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: acd0498f-ab5e-4587-8a4f-1dca96620063; Proxy: null)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1862)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1415)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1384)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1154)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:811)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:779)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:753)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:713)
tests               |   at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:695)
tests               |   at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:559)
tests               |   at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:539)
tests               |   at com.amazonaws.services.glue.AWSGlueClient.doInvoke(AWSGlueClient.java:11444)
tests               |   at com.amazonaws.services.glue.AWSGlueClient.invoke(AWSGlueClient.java:11411)
tests               |   at com.amazonaws.services.glue.AWSGlueClient.invoke(AWSGlueClient.java:11400)
tests               |   at com.amazonaws.services.glue.AWSGlueClient.executeDeleteTable(AWSGlueClient.java:3688)
tests               |   at com.amazonaws.services.glue.AWSGlueClient.deleteTable(AWSGlueClient.java:3657)
tests               |   at io.trino.plugin.hive.metastore.glue.GlueHiveMetastore.lambda$dropTable$19(GlueHiveMetastore.java:578)
tests               |   at io.trino.plugin.hive.metastore.glue.GlueMetastoreApiStats.call(GlueMetastoreApiStats.java:35)
tests               |   at io.trino.plugin.hive.metastore.glue.GlueHiveMetastore.dropTable(GlueHiveMetastore.java:577)
tests               |   ... 14 more

https://github.com/trinodb/trino/runs/7362895993

ebyhr commented 2 years ago

The same error happened in TestDeltaLakeDatabricksInsertCompatibility.testCompression https://github.com/trinodb/trino/runs/7552794871

ebyhr commented 2 years ago

TestDeltaLakeDropTableCompatibility.testDropTable https://github.com/trinodb/trino/runs/7461160182

ebyhr commented 2 years ago

TestDeltaLakeDropTableCompatibility.testDropTable https://github.com/trinodb/trino/runs/7571034833

ebyhr commented 2 years ago

TestDeltaLakeDatabricksCreateTableCompatibility.testCreateTableWithColumnCommentOnTrino https://github.com/trinodb/trino/runs/7574003052

ebyhr commented 2 years ago

I could reproduce this issue locally. Investigating.

findepi commented 2 years ago

@ebyhr are your repro steps something you can share here?

ebyhr commented 2 years ago

I set invocationCount = 100 and threadPoolSize = 10 in TestDeltaLakeDatabricksCreateTableCompatibility#testCreateTableWithColumnCommentOnTrino. The failure rate is about 1-2%.
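For context, a minimal sketch of that kind of stress repro using TestNG's invocationCount and threadPoolSize attributes; the class name and test body below are hypothetical stand-ins, not the actual product test:

```java
import org.testng.annotations.Test;

public class CreateTableStressRepro
{
    // Hypothetical stand-in for the product test: run the same scenario 100 times
    // across 10 threads so the Glue ConcurrentModificationException surfaces in
    // roughly 1-2% of invocations.
    @Test(invocationCount = 100, threadPoolSize = 10)
    public void testCreateTableWithColumnComment()
    {
        // CREATE TABLE ... / COMMENT ON COLUMN ... / DROP TABLE ... against the Glue-backed catalog
    }
}
```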

ebyhr commented 2 years ago

Current status: we are asking the AWS support team.

ebyhr commented 2 years ago

TestDeltaLakeDatabricksCreateTableCompatibility.testCreateTableWithColumnCommentOnTrino https://github.com/trinodb/trino/runs/7719217375

ebyhr commented 2 years ago

TestDeltaLakeDatabricksInsertCompatibility.testCompression https://github.com/trinodb/trino/runs/7760288355

ebyhr commented 2 years ago

TestDeltaLakeDatabricksInsertCompatibility.testCompression https://github.com/trinodb/trino/runs/8232914632

findinpath commented 2 years ago

https://github.com/trinodb/trino/actions/runs/3336163846/jobs/5524119463

ebyhr commented 2 years ago

https://github.com/trinodb/trino/actions/runs/3413120185/jobs/5680402178

ebyhr commented 2 years ago

https://github.com/trinodb/trino/actions/runs/3422852633/jobs/5702336869

findinpath commented 1 year ago

https://github.com/trinodb/trino/actions/runs/3495735811/jobs/5853835535

tests               | 2022-11-18 18:23:48 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName (Groups: profile_specific_tests, delta-lake-exclude-91, delta-lake-databricks, delta-lake-oss, delta-lake-exclude-73) took 9.7 seconds

https://github.com/trinodb/trino/actions/runs/3495735811/jobs/5853835644

tests               | 2022-11-18 17:46:13 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeDatabricksCreateTableCompatibility.testCreateTableWithColumnCommentOnTrino (Groups: profile_specific_tests, delta-lake-databricks) took 6.2 seconds

findinpath commented 1 year ago

https://github.com/trinodb/trino/actions/runs/3518724371/jobs/5899508450

2022-11-22 09:24:28 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: d935416a-1e3a-48d7-ae4f-603a1d7d6eaf; Proxy: null))
....
Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: d935416a-1e3a-48d7-ae4f-603a1d7d6eaf; Proxy: null))
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
tests               |   at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
tests               |   at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
tests               |   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
tests               |   ... 90 more
tests               | , Query: DROP TABLE default.test_dl_unsupported_column_mapping_mode_l1gen4kso6.
(screenshot: AWS Glue table version list)

All the table versions were created in AWS Glue at November 22, 2022, 03:39:28 (UTC). Maybe the DROP TABLE is conflicting with Glue's internal versioning mechanism?
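One way to double-check that observation is to list the table's versions and their update times through the Glue API. A minimal sketch, assuming the AWS SDK for Java v1 (the same SDK family that appears in the stack traces); the database and table names are placeholders:

```java
import com.amazonaws.services.glue.AWSGlue;
import com.amazonaws.services.glue.AWSGlueClientBuilder;
import com.amazonaws.services.glue.model.GetTableVersionsRequest;
import com.amazonaws.services.glue.model.TableVersion;

public class ListGlueTableVersions
{
    public static void main(String[] args)
    {
        AWSGlue glue = AWSGlueClientBuilder.defaultClient();

        // Fetch the version history of the (placeholder) table and print when each
        // version was written, to see whether the updates cluster around the DROP.
        for (TableVersion version : glue.getTableVersions(new GetTableVersionsRequest()
                        .withDatabaseName("default")
                        .withTableName("test_dl_unsupported_column_mapping_mode_l1gen4kso6"))
                .getTableVersions()) {
            System.out.printf("version %s updated at %s%n",
                    version.getVersionId(),
                    version.getTable().getUpdateTime());
        }
    }
}
```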

ebyhr commented 1 year ago

@dennyglee Hi, the Delta Lake connector has suffered from flaky tests since July. We observed that Databricks calls glue:UpdateTable in the background, and those operations lead to a concurrent-modification error when our tests drop the table. Is it possible to get help from Databricks? I can share more details, including the response body, if you want.
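On the test side, the later stack traces show the product tests already wrap drops in DeltaLakeTestUtils.dropDeltaTableWithRetry. A rough sketch of that retry idea with Failsafe 3.x and plain JDBC; the matched message, backoff, retry count, and connection details are assumptions, not the project's actual settings:

```java
import dev.failsafe.Failsafe;
import dev.failsafe.RetryPolicy;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.time.temporal.ChronoUnit;

public class DropTableWithRetry
{
    public static void main(String[] args)
            throws Exception
    {
        // Retry only when Glue reports a concurrent modification, and back off so the
        // background glue:UpdateTable calls have time to finish (values are assumptions).
        RetryPolicy<Object> retryOnConcurrentModification = RetryPolicy.builder()
                .handleIf(e -> e.getMessage() != null
                        && e.getMessage().contains("Table being modified concurrently"))
                .withBackoff(1, 10, ChronoUnit.SECONDS)
                .withMaxRetries(3)
                .build();

        // Placeholder connection details for a local Trino with a Glue-backed delta catalog.
        try (Connection connection = DriverManager.getConnection("jdbc:trino://localhost:8080/delta/default", "test", null);
                Statement statement = connection.createStatement()) {
            Failsafe.with(retryOnConcurrentModification)
                    .run(() -> statement.execute("DROP TABLE IF EXISTS test_dl_comment_column"));
        }
    }
}
```

Since the drop is only test cleanup, retrying it a few times is harmless; the failures above happen when the error is not recognized as retriable.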

dennyglee commented 1 year ago

Hi @ebyhr - yes it is. Do you have a support contract with Databricks? If not, please ping me directly (I'm on both Trino and Delta Lake slack) and I can help with this, eh?!

findinpath commented 1 year ago

https://github.com/trinodb/trino/actions/runs/3623643550/jobs/6110579877

io.trino.tests.product.deltalake.TestDeltaLakeDatabricksInsertCompatibility.testCompressionWithOptimizedWriter(TestDeltaLakeDatabricksInsertCompatibility.java:384)

ebyhr commented 1 year ago

https://github.com/trinodb/trino/actions/runs/3701471005/jobs/6270913606

tests               | 2022-12-15 12:34:49 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testRenameColumn (Groups: profile_specific_tests, delta-lake-exclude-91, delta-lake-databricks, delta-lake-oss, delta-lake-exclude-73) took 9.0 seconds
tests               | 2022-12-15 12:34:49 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: a04efa17-e006-4bc1-84f1-fe9d027cd253; Proxy: null))
tests               |   at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:103)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
tests               |   at java.security.AccessController.doPrivileged(Native Method)
tests               |   at javax.security.auth.Subject.doAs(Subject.java:422)
tests               |   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
tests               |   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
tests               |   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
tests               |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
tests               |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
tests               |   at java.lang.Thread.run(Thread.java:750)
tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: a04efa17-e006-4bc1-84f1-fe9d027cd253; Proxy: null))
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
tests               |   at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
tests               |   at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
tests               |   at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
tests               |   at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
tests               |   at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
tests               |   at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
tests               |   at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
tests               |   at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:241)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
tests               |   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
tests               |   at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:241)
tests               |   at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:226)
tests               |   at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:239)
tests               |   at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:232)
tests               |   at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
tests               |   at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
tests               |   at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
tests               |   at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
tests               |   at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
tests               |   at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
tests               |   at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
tests               |   at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
tests               |   at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
tests               |   at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:232)
tests               |   at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
tests               |   at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:232)
tests               |   at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:186)
tests               |   at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:177)
tests               |   at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
tests               |   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
tests               |   at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:691)
tests               |   at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
tests               |   at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
tests               |   ... 19 more
tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: a04efa17-e006-4bc1-84f1-fe9d027cd253; Proxy: null))
tests               |   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
tests               |   at sun.reflect.GeneratedMethodAccessor418.invoke(Unknown Source)
tests               |   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
tests               |   at java.lang.reflect.Method.invoke(Method.java:498)
tests               |   at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1372)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
tests               |   at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
tests               |   ... 70 more
tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: a04efa17-e006-4bc1-84f1-fe9d027cd253; Proxy: null))
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
tests               |   at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
tests               |   at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
tests               |   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
tests               |   ... 90 more
tests               | , Query: DROP TABLE default.test_dl_rename_column_mo3zp8830e.

ebyhr commented 1 year ago

TestDeltaLakeDatabricksInsertCompatibility > testMetadataOperationsRetainCheckConstraints https://github.com/trinodb/trino/runs/10170954000

ebyhr commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4039498906/jobs/6944434840 https://github.com/trinodb/trino/actions/runs/4039498906/jobs/6944434909

ebyhr commented 1 year ago
tests               | 2023-01-30 12:00:41 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testCommentOnColumn (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 6.4 seconds

https://github.com/trinodb/trino/actions/runs/4040936189/jobs/6947155070

homar commented 1 year ago

Happened again here: https://github.com/trinodb/trino/pull/15453

2023-02-08 10:48:18 INFO: FAILURE / io.trino.tests.product.deltalake.TestDeltaLakeDropTableCompatibility.testDropTable [TRINO, DELTA, false]

ebyhr commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4141068942/jobs/7164352124

io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testCommentOnColumn (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 3.4 seconds

krvikash commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4158597848/jobs/7194012791

io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testCommentOnColumn (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 5.1 seconds

findinpath commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4141068942/jobs/7164352124

2023-02-10 19:39:27 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testCommentOnColumn (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 3.4 seconds
tests               | 2023-02-10 19:39:27 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: Query failed (#20230210_135426_00183_79ntm): Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: b8c131f0-ba3e-4f85-b989-ca2e966dfff6; Proxy: null)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:119)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
ebyhr commented 1 year ago
io.trino.tests.product.deltalake.TestDeltaLakeDatabricksCreateTableCompatibility.testCreateTableWithColumnCommentOnTrino (Groups: profile_specific_tests, delta-lake-databricks) took 4.5 seconds
io.trino.tests.product.deltalake.TestDeltaLakeDatabricksInsertCompatibility.testCompression [SNAPPY] (Groups: profile_specific_tests, delta-lake-databricks) took 5.0 seconds

https://github.com/trinodb/trino/actions/runs/4161013002/jobs/7199251252

pajaks commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4204417729

io.trino.tests.product.deltalake.TestDeltaLakeAlterTableCompatibility.testAddColumnWithCommentOnTrino
io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName

findepi commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4289597275/jobs/7473173014 (run of https://github.com/trinodb/trino/pull/14891 with secrets)

2023-02-28T06:35:19.5037892Z tests               | 2023-02-28 12:20:19 WARNING: not retrying; stacktrace does not match pattern '\Q[Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP retry after response received with no Retry-After header, error: HTTP Response code: 503, Error message: Unknown.': [io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5040186Z tests               |  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-02-28T06:35:19.5041278Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
2023-02-28T06:35:19.5042153Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5042800Z tests               |  at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
2023-02-28T06:35:19.5043850Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
2023-02-28T06:35:19.5045131Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
2023-02-28T06:35:19.5046014Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5046898Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-02-28T06:35:19.5048010Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-02-28T06:35:19.5049200Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.5050701Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
2023-02-28T06:35:19.5051809Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
2023-02-28T06:35:19.5052681Z tests               |  at java.security.AccessController.doPrivileged(Native Method)
2023-02-28T06:35:19.5053400Z tests               |  at javax.security.auth.Subject.doAs(Subject.java:422)
2023-02-28T06:35:19.5054111Z tests               |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-02-28T06:35:19.5055079Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
2023-02-28T06:35:19.5055939Z tests               |  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-02-28T06:35:19.5056580Z tests               |  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-02-28T06:35:19.5057272Z tests               |  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-02-28T06:35:19.5058023Z tests               |  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-02-28T06:35:19.5058607Z tests               |  at java.lang.Thread.run(Thread.java:750)
2023-02-28T06:35:19.5060188Z tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5061382Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
2023-02-28T06:35:19.5062252Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
2023-02-28T06:35:19.5063116Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
2023-02-28T06:35:19.5063944Z tests               |  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
2023-02-28T06:35:19.5064836Z tests               |  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
2023-02-28T06:35:19.5065698Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
2023-02-28T06:35:19.5066536Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
2023-02-28T06:35:19.5067492Z tests               |  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
2023-02-28T06:35:19.5068495Z tests               |  at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
2023-02-28T06:35:19.5069486Z tests               |  at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
2023-02-28T06:35:19.5070461Z tests               |  at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
2023-02-28T06:35:19.5071276Z tests               |  at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
2023-02-28T06:35:19.5072175Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
2023-02-28T06:35:19.5073162Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
2023-02-28T06:35:19.5074732Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
2023-02-28T06:35:19.5169086Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:246)
2023-02-28T06:35:19.5170240Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
2023-02-28T06:35:19.5171074Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
2023-02-28T06:35:19.5171998Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
2023-02-28T06:35:19.5172765Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.5173533Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
2023-02-28T06:35:19.5174313Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
2023-02-28T06:35:19.5175229Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:246)
2023-02-28T06:35:19.5176219Z tests               |  at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:231)
2023-02-28T06:35:19.5177221Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:244)
2023-02-28T06:35:19.5178179Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:237)
2023-02-28T06:35:19.5179067Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
2023-02-28T06:35:19.5179862Z tests               |  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
2023-02-28T06:35:19.5180680Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
2023-02-28T06:35:19.5181724Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5182832Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
2023-02-28T06:35:19.5183923Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
2023-02-28T06:35:19.5184972Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5186006Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5186900Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
2023-02-28T06:35:19.5187726Z tests               |  at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:237)
2023-02-28T06:35:19.5188670Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
2023-02-28T06:35:19.5189632Z tests               |  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:237)
2023-02-28T06:35:19.5190549Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:191)
2023-02-28T06:35:19.5191428Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:182)
2023-02-28T06:35:19.5192205Z tests               |  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
2023-02-28T06:35:19.5193086Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
2023-02-28T06:35:19.5194029Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.5194949Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.5196188Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
2023-02-28T06:35:19.5264310Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.5265543Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.5266283Z tests               |  at org.apache.
2023-02-28T06:35:19.5266733Z tests               | spark.util.Utils$.timeTakenMs(Utils.scala:692)
2023-02-28T06:35:19.5267475Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
2023-02-28T06:35:19.5268387Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
2023-02-28T06:35:19.5269470Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.5270631Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.5271342Z tests               |  ... 19 more
2023-02-28T06:35:19.5272824Z tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5273882Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
2023-02-28T06:35:19.5320199Z tests               |  at sun.reflect.GeneratedMethodAccessor394.invoke(Unknown Source)
2023-02-28T06:35:19.5321008Z tests               |  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.5321713Z tests               |  at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-28T06:35:19.5322338Z tests               |  at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1373)
2023-02-28T06:35:19.5323099Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
2023-02-28T06:35:19.5323827Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5324568Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
2023-02-28T06:35:19.5325381Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
2023-02-28T06:35:19.5326237Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
2023-02-28T06:35:19.5327113Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
2023-02-28T06:35:19.5327936Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
2023-02-28T06:35:19.5328772Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
2023-02-28T06:35:19.5330633Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
2023-02-28T06:35:19.5331519Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.5332398Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
2023-02-28T06:35:19.5333399Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.5334232Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
2023-02-28T06:35:19.5334960Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5335639Z tests               |  at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
2023-02-28T06:35:19.5336397Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
2023-02-28T06:35:19.5336959Z tests               |  ... 70 more
2023-02-28T06:35:19.5338206Z tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5339323Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
2023-02-28T06:35:19.5340404Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
2023-02-28T06:35:19.5341493Z tests               |  at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
2023-02-28T06:35:19.5342536Z tests               |  at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
2023-02-28T06:35:19.5343399Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
2023-02-28T06:35:19.5343919Z tests               |  ... 90 more
2023-02-28T06:35:19.5344446Z tests               | , Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk.
2023-02-28T06:35:19.5345161Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:119)
2023-02-28T06:35:19.5346017Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
2023-02-28T06:35:19.5346962Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
2023-02-28T06:35:19.5347648Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.5348257Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.5348939Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.5349694Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.5350362Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.5351002Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.5351682Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
2023-02-28T06:35:19.5352567Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.lambda$dropDeltaTableWithRetry$4(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.5353361Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.5354041Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.5354747Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.5355484Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.5356168Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.5357012Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.5357882Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.dropDeltaTableWithRetry(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.5359209Z tests               |  at io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName(TestDeltaLakeColumnMappingMode.java:360)
2023-02-28T06:35:19.5360490Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-28T06:35:19.5361252Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
2023-02-28T06:35:19.5362107Z tests               |  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.5362800Z tests               |  at java.base/java.lang.reflect.Method.invoke(Method.java:568)
2023-02-28T06:35:19.5363483Z tests               |  at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
2023-02-28T06:35:19.5364171Z tests               |  at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
2023-02-28T06:35:19.5364753Z tests               |  at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
2023-02-28T06:35:19.5365365Z tests               |  at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
2023-02-28T06:35:19.5366220Z tests               |  at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
2023-02-28T06:35:19.5366956Z tests               |  at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
2023-02-28T06:35:19.5367695Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2023-02-28T06:35:19.5368479Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2023-02-28T06:35:19.5369102Z tests               |  at java.base/java.lang.Thread.run(Thread.java:833)
2023-02-28T06:35:19.5371277Z tests               | Caused by: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5372944Z tests               |  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-02-28T06:35:19.5373993Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
2023-02-28T06:35:19.5374881Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5375506Z tests               |  at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
2023-02-28T06:35:19.5376317Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thrift
2023-02-28T06:35:19.5377182Z tests               | server$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
2023-02-28T06:35:19.5378253Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
2023-02-28T06:35:19.5379200Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5380073Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-02-28T06:35:19.5381259Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-02-28T06:35:19.5382449Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.5383659Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
2023-02-28T06:35:19.5384752Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
2023-02-28T06:35:19.5385621Z tests               |  at java.security.AccessController.doPrivileged(Native Method)
2023-02-28T06:35:19.5386227Z tests               |  at javax.security.auth.Subject.doAs(Subject.java:422)
2023-02-28T06:35:19.5386942Z tests               |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-02-28T06:35:19.5387902Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
2023-02-28T06:35:19.5388778Z tests               |  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-02-28T06:35:19.5389412Z tests               |  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-02-28T06:35:19.5479099Z tests               |  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-02-28T06:35:19.5479902Z tests               |  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-02-28T06:35:19.5480497Z tests               |  at java.lang.Thread.run(Thread.java:750)
2023-02-28T06:35:19.5482164Z tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5483384Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
2023-02-28T06:35:19.5484262Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
2023-02-28T06:35:19.5485134Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
2023-02-28T06:35:19.5485968Z tests               |  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
2023-02-28T06:35:19.5486864Z tests               |  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
2023-02-28T06:35:19.5487733Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
2023-02-28T06:35:19.5488564Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
2023-02-28T06:35:19.5489509Z tests               |  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
2023-02-28T06:35:19.5490622Z tests               |  at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
2023-02-28T06:35:19.5492004Z tests               |  at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
2023-02-28T06:35:19.5492979Z tests               |  at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
2023-02-28T06:35:19.5493783Z tests               |  at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
2023-02-28T06:35:19.5494792Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
2023-02-28T06:35:19.5495803Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
2023-02-28T06:35:19.5496756Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
2023-02-28T06:35:19.5497752Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:246)
2023-02-28T06:35:19.5498644Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
2023-02-28T06:35:19.5499460Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
2023-02-28T06:35:19.5500260Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
2023-02-28T06:35:19.5501007Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.5501749Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
2023-02-28T06:35:19.5502523Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
2023-02-28T06:35:19.5503429Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:246)
2023-02-28T06:35:19.5504417Z tests               |  at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:231)
2023-02-28T06:35:19.5505404Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:244)
2023-02-28T06:35:19.5506358Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:237)
2023-02-28T06:35:19.5507253Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
2023-02-28T06:35:19.5508027Z tests               |  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
2023-02-28T06:35:19.5508859Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
2023-02-28T06:35:19.5509906Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5511020Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
2023-02-28T06:35:19.5512120Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
2023-02-28T06:35:19.5513163Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5514195Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5515154Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
2023-02-28T06:35:19.5515993Z tests               |  at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:237)
2023-02-28T06:35:19.5597482Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
2023-02-28T06:35:19.5599089Z tests               |  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:237)
2023-02-28T06:35:19.5600042Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:191)
2023-02-28T06:35:19.5600928Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:182)
2023-02-28T06:35:19.5601644Z tests               |  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
2023-02-28T06:35:19.5602530Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
2023-02-28T06:35:19.5603461Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.5604388Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.5605576Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
2023-02-28T06:35:19.5606771Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.5607942Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.5608795Z tests               |  at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:692)
2023-02-28T06:35:19.5609549Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
2023-02-28T06:35:19.5610475Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
2023-02-28T06:35:19.5611530Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.5612709Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.5613414Z tests               |  ... 19 more
2023-02-28T06:35:19.5614874Z tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5615904Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
2023-02-28T06:35:19.5616571Z tests               |  at sun.reflect.GeneratedMethodAccessor394.invoke(Unknown Source)
2023-02-28T06:35:19.5617335Z tests               |  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.5618176Z tests               |  at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-28T06:35:19.5618808Z tests               |  at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1373)
2023-02-28T06:35:19.5619548Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
2023-02-28T06:35:19.5620269Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5621123Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
2023-02-28T06:35:19.5621946Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
2023-02-28T06:35:19.5622791Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
2023-02-28T06:35:19.5623748Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
2023-02-28T06:35:19.5624576Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
2023-02-28T06:35:19.5625407Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
2023-02-28T06:35:19.5626250Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
2023-02-28T06:35:19.5627103Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.5628001Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
2023-02-28T06:35:19.5628872Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.5629711Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
2023-02-28T06:35:19.5630423Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5631094Z tests               |  at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
2023-02-28T06:35:19.5631841Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
2023-02-28T06:35:19.5632411Z tests               |  ... 70 more
2023-02-28T06:35:19.5633481Z tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5634577Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
2023-02-28T06:35:19.5636247Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
2023-02-28T06:35:19.5718040Z tests               |  at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
2023-02-28T06:35:19.5719068Z tests               |  at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
2023-02-28T06:35:19.5719885Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
2023-02-28T06:35:19.5720366Z tests               |  ... 90 more
2023-02-28T06:35:19.5720864Z tests               | , Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk.
2023-02-28T06:35:19.5721662Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.buildExceptionFromTStatusSqlState(Unknown Source)
2023-02-28T06:35:19.5722574Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.pollForOperationCompletion(Unknown Source)
2023-02-28T06:35:19.5723439Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.executeStatementInternal(Unknown Source)
2023-02-28T06:35:19.5724251Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.executeStatement(Unknown Source)
2023-02-28T06:35:19.5725208Z tests               |  at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.executeRowCountQueryHelper(Unknown Source)
2023-02-28T06:35:19.5726600Z tests               |  at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.execute(Unknown Source)
2023-02-28T06:35:19.5727430Z tests               |  at com.databricks.client.jdbc.common.SStatement.executeNoParams(Unknown Source)
2023-02-28T06:35:19.5728155Z tests               |  at com.databricks.client.jdbc.common.BaseStatement.execute(Unknown Source)
2023-02-28T06:35:19.5728974Z tests               |  at com.databricks.client.hivecommon.jdbc42.Hive42Statement.execute(Unknown Source)
2023-02-28T06:35:19.5729756Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:128)
2023-02-28T06:35:19.5730494Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
2023-02-28T06:35:19.5731218Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
2023-02-28T06:35:19.5732666Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
2023-02-28T06:35:19.5733956Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.5734551Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.5735280Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.5736052Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.5736720Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.5738301Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.5738992Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
2023-02-28T06:35:19.5739883Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.lambda$dropDeltaTableWithRetry$4(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.5740679Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.5741281Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.5741968Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.5742731Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.5743398Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.5744033Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.5744883Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.dropDeltaTableWithRetry(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.5746224Z tests               |  at io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName(TestDeltaLakeColumnMappingMode.java:360)
2023-02-28T06:35:19.5747390Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-28T06:35:19.5748206Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
2023-02-28T06:35:19.5749128Z tests               |  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.5749874Z tests               |  at java.base/java.lang.reflect.Method.invoke(Method.java:568)
2023-02-28T06:35:19.5750612Z tests               |  at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
2023-02-28T06:35:19.5751333Z tests               |  at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
2023-02-28T06:35:19.5752159Z tests               |  at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
2023-02-28T06:35:19.5752817Z tests               |  at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
2023-02-28T06:35:19.5753549Z tests               |  at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
2023-02-28T06:35:19.5754269Z tests               |  at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
2023-02-28T06:35:19.5755103Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2023-02-28T06:35:19.5755883Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2023-02-28T06:35:19.5756671Z tests               |  Suppressed: java.lang.Exception: Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk
2023-02-28T06:35:19.5798583Z tests               |      at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:136)
2023-02-28T06:35:19.5799419Z tests               |      at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
2023-02-28T06:35:19.5800189Z tests               |      at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
2023-02-28T06:35:19.5800975Z tests               |      at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
2023-02-28T06:35:19.5801688Z tests               |      at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.5802278Z tests               |      at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.5803062Z tests               |      at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.5803754Z tests               |      at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.5804397Z tests               |      at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.5804984Z tests               |      at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.5805631Z tests               |      at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
2023-02-28T06:35:19.5806456Z tests               |      at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.lambda$dropDeltaTableWithRetry$4(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.5807196Z tests               |      at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.5807767Z tests               |      at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.5808396Z tests               |      at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.5809104Z tests               |      at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.5809720Z tests               |      at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.5810314Z tests               |      at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.5811103Z tests               |      at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.dropDeltaTableWithRetry(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.5812355Z tests               |      at io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName(TestDeltaLakeColumnMappingMode.java:360)
2023-02-28T06:35:19.5813428Z tests               |      at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-28T06:35:19.5814202Z tests               |      at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
2023-02-28T06:35:19.5815571Z tests               |      at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.5816335Z tests               |      at java.base/java.lang.reflect.Method.invoke(Method.java:568)
2023-02-28T06:35:19.5817073Z tests               |      at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
2023-02-28T06:35:19.5817795Z tests               |      at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
2023-02-28T06:35:19.5818541Z tests               |      at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
2023-02-28T06:35:19.5819191Z tests               |      at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
2023-02-28T06:35:19.5819927Z tests               |      at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
2023-02-28T06:35:19.5820644Z tests               |      at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
2023-02-28T06:35:19.5821393Z tests               |      at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2023-02-28T06:35:19.5822170Z tests               |      at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2023-02-28T06:35:19.5822808Z tests               |      at java.base/java.lang.Thread.run(Thread.java:833)
2023-02-28T06:35:19.5825328Z tests               | Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5827110Z tests               |  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-02-28T06:35:19.5828187Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
2023-02-28T06:35:19.5829056Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5829689Z tests               |  at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
2023-02-28T06:35:19.5830732Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
2023-02-28T06:35:19.5832010Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
2023-02-28T06:35:19.5832901Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5833797Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-02-28T06:35:19.5834908Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-02-28T06:35:19.5836095Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.5837452Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
2023-02-28T06:35:19.5838558Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
2023-02-28T06:35:19.5839431Z tests               |  at java.security.AccessController.doPrivileged(Native Method)
2023-02-28T06:35:19.5840137Z tests               |  at javax.security.auth.Subject.doAs(Subject.java:422)
2023-02-28T06:35:19.5840847Z tests               |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-02-28T06:35:19.5841816Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
2023-02-28T06:35:19.5842751Z tests               |  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-02-28T06:35:19.5843393Z tests               |  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-02-28T06:35:19.5844080Z tests               |  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-02-28T06:35:19.5844833Z tests               |  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-02-28T06:35:19.5845420Z tests               |  at java.lang.Thread.run(Thread.java:750)
2023-02-28T06:35:19.5846936Z tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5848137Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
2023-02-28T06:35:19.5848992Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
2023-02-28T06:35:19.5850193Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
2023-02-28T06:35:19.5851041Z tests               |  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
2023-02-28T06:35:19.5851926Z tests               |  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
2023-02-28T06:35:19.5852800Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
2023-02-28T06:35:19.5853618Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
2023-02-28T06:35:19.5854582Z tests               |  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
2023-02-28T06:35:19.5855569Z tests               |  at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
2023-02-28T06:35:19.5856579Z tests               |  at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
2023-02-28T06:35:19.5857548Z tests               |  at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
2023-02-28T06:35:19.5858368Z tests               |  at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
2023-02-28T06:35:19.5859280Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
2023-02-28T06:35:19.5860265Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
2023-02-28T06:35:19.5861233Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
2023-02-28T06:35:19.5862201Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:246)
2023-02-28T06:35:19.5863110Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
2023-02-28T06:35:19.5863982Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
2023-02-28T06:35:19.5864799Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
2023-02-28T06:35:19.5865549Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.5866296Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
2023-02-28T06:35:19.5867154Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
2023-02-28T06:35:19.5868040Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:246)
2023-02-28T06:35:19.5869055Z tests               |  at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:231)
2023-02-28T06:35:19.5870041Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:244)
2023-02-28T06:35:19.5871004Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:237)
2023-02-28T06:35:19.5952991Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
2023-02-28T06:35:19.5953811Z tests               |  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
2023-02-28T06:35:19.5954655Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
2023-02-28T06:35:19.5955705Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5957156Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
2023-02-28T06:35:19.5958260Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
2023-02-28T06:35:19.5959340Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5960349Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.5961260Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
2023-02-28T06:35:19.5962100Z tests               |  at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:237)
2023-02-28T06:35:19.5963033Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
2023-02-28T06:35:19.5964002Z tests               |  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:237)
2023-02-28T06:35:19.5964914Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:191)
2023-02-28T06:35:19.5965884Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:182)
2023-02-28T06:35:19.5966536Z tests               |  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
2023-02-28T06:35:19.5967359Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
2023-02-28T06:35:19.5968385Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.5969658Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.5970850Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
2023-02-28T06:35:19.5972060Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.5973335Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.5974167Z tests               |  at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:692)
2023-02-28T06:35:19.5974936Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
2023-02-28T06:35:19.5975844Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
2023-02-28T06:35:19.5976922Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.5978084Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.5978813Z tests               |  ... 19 more
2023-02-28T06:35:19.5980272Z tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.5981299Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
2023-02-28T06:35:19.5981983Z tests               |  at sun.reflect.GeneratedMethodAccessor394.invoke(Unknown Source)
2023-02-28T06:35:19.5982731Z tests               |  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.5983422Z tests               |  at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-28T06:35:19.5984038Z tests               |  at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1373)
2023-02-28T06:35:19.5984793Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
2023-02-28T06:35:19.5985510Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5986244Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
2023-02-28T06:35:19.5987045Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
2023-02-28T06:35:19.5987906Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
2023-02-28T06:35:19.5988785Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
2023-02-28T06:35:19.5989608Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
2023-02-28T06:35:19.5990442Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
2023-02-28T06:35:19.5991264Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
2023-02-28T06:35:19.5992132Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.5993083Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
2023-02-28T06:35:19.5993891Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.5995133Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
2023-02-28T06:35:19.5995916Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.5996589Z tests               |  at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
2023-02-28T06:35:19.5997501Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
2023-02-28T06:35:19.5998050Z tests               |  ... 70 more
2023-02-28T06:35:19.5999148Z tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6000231Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
2023-02-28T06:35:19.6001325Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
2023-02-28T06:35:19.6002409Z tests               |  at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
2023-02-28T06:35:19.6003458Z tests               |  at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
2023-02-28T06:35:19.6004305Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
2023-02-28T06:35:19.6004819Z tests               |  ... 90 more
2023-02-28T06:35:19.6005341Z tests               | , Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk.
2023-02-28T06:35:19.6005836Z tests               |  ... 41 more
2023-02-28T06:35:19.6006165Z tests               | ]
2023-02-28T06:35:19.6007747Z tests               | 2023-02-28 12:20:19 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName [id] (Groups: profile_specific_tests, delta-lake-exclude-91, delta-lake-databricks, delta-lake-oss, delta-lake-exclude-73) took 13.7 seconds
2023-02-28T06:35:19.6009042Z tests               | 2023-02-28 12:20:19 SEVERE: Failure cause:
2023-02-28T06:35:19.6011249Z tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6012996Z tests               |  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-02-28T06:35:19.6014062Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
2023-02-28T06:35:19.6014931Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6015565Z tests               |  at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
2023-02-28T06:35:19.6016605Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
2023-02-28T06:35:19.6017976Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
2023-02-28T06:35:19.6018847Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6019733Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-02-28T06:35:19.6021002Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-02-28T06:35:19.6022099Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.6023209Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
2023-02-28T06:35:19.6024477Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
2023-02-28T06:35:19.6025350Z tests               |  at java.security.AccessController.doPrivileged(Native Method)
2023-02-28T06:35:19.6025954Z tests               |  at javax.security.auth.Subject.doAs(Subject.java:422)
2023-02-28T06:35:19.6026671Z tests               |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-02-28T06:35:19.6027624Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
2023-02-28T06:35:19.6028490Z tests               |  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-02-28T06:35:19.6029133Z tests               |  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-02-28T06:35:19.6029820Z tests               |  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-02-28T06:35:19.6030574Z tests               |  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-02-28T06:35:19.6031160Z tests               |  at java.lang.Thread.run(Thread.java:750)
2023-02-28T06:35:19.6032647Z tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6033822Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
2023-02-28T06:35:19.6034693Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
2023-02-28T06:35:19.6035542Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
2023-02-28T06:35:19.6036389Z tests               |  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
2023-02-28T06:35:19.6038472Z tests               |  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
2023-02-28T06:35:19.6039344Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
2023-02-28T06:35:19.6040185Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
2023-02-28T06:35:19.6041137Z tests               |  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
2023-02-28T06:35:19.6042147Z tests               |  at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
2023-02-28T06:35:19.6043276Z tests               |  at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
2023-02-28T06:35:19.6044257Z tests               |  at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
2023-02-28T06:35:19.6045063Z tests               |  at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
2023-02-28T06:35:19.6046057Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
2023-02-28T06:35:19.6047049Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
2023-02-28T06:35:19.6048021Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
2023-02-28T06:35:19.6049055Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:246)
2023-02-28T06:35:19.6049962Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
2023-02-28T06:35:19.6050781Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
2023-02-28T06:35:19.6051584Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
2023-02-28T06:35:19.6052337Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.6053079Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
2023-02-28T06:35:19.6053863Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
2023-02-28T06:35:19.6054748Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:246)
2023-02-28T06:35:19.6055760Z tests               |  at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:231)
2023-02-28T06:35:19.6056733Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:244)
2023-02-28T06:35:19.6057381Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:237)
2023-02-28T06:35:19.6058968Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
2023-02-28T06:35:19.6059700Z tests               |  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
2023-02-28T06:35:19.6060454Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
2023-02-28T06:35:19.6061418Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6062702Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
2023-02-28T06:35:19.6063786Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
2023-02-28T06:35:19.6064852Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6065936Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6066856Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
2023-02-28T06:35:19.6067679Z tests               |  at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:237)
2023-02-28T06:35:19.6068617Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
2023-02-28T06:35:19.6075061Z tests               |  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:237)
2023-02-28T06:35:19.6076176Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:191)
2023-02-28T06:35:19.6077340Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:182)
2023-02-28T06:35:19.6078062Z tests               |  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
2023-02-28T06:35:19.6078963Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
2023-02-28T06:35:19.6079871Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.6080805Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.6081979Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
2023-02-28T06:35:19.6083203Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.6084361Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.6085213Z tests               |  at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:692)
2023-02-28T06:35:19.6085974Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
2023-02-28T06:35:19.6086880Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
2023-02-28T06:35:19.6087953Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.6089104Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.6089822Z tests               |  ... 19 more
2023-02-28T06:35:19.6091323Z tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6092380Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
2023-02-28T06:35:19.6093051Z tests               |  at sun.reflect.GeneratedMethodAccessor394.invoke(Unknown Source)
2023-02-28T06:35:19.6093844Z tests               |  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.6094536Z tests               |  at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-28T06:35:19.6095155Z tests               |  at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1373)
2023-02-28T06:35:19.6095904Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
2023-02-28T06:35:19.6096873Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6097618Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
2023-02-28T06:35:19.6098417Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
2023-02-28T06:35:19.6099379Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
2023-02-28T06:35:19.6100531Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
2023-02-28T06:35:19.6101376Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
2023-02-28T06:35:19.6102184Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
2023-02-28T06:35:19.6103025Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
2023-02-28T06:35:19.6103889Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.6104762Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
2023-02-28T06:35:19.6105652Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.6106486Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
2023-02-28T06:35:19.6107211Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6107864Z tests               |  at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
2023-02-28T06:35:19.6108632Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
2023-02-28T06:35:19.6109185Z tests               |  ... 70 more
2023-02-28T06:35:19.6130127Z tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6131386Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
2023-02-28T06:35:19.6132493Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
2023-02-28T06:35:19.6133580Z tests               |  at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
2023-02-28T06:35:19.6134643Z tests               |  at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
2023-02-28T06:35:19.6135496Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
2023-02-28T06:35:19.6136022Z tests               |  ... 90 more
2023-02-28T06:35:19.6136551Z tests               | , Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk.
2023-02-28T06:35:19.6137265Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:119)
2023-02-28T06:35:19.6138050Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
2023-02-28T06:35:19.6138825Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
2023-02-28T06:35:19.6139534Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.6140126Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.6141091Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.6141845Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.6142539Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.6143162Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.6143946Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
2023-02-28T06:35:19.6144819Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.lambda$dropDeltaTableWithRetry$4(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.6145637Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.6146222Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.6146927Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.6147679Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.6149088Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.6151610Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.6152478Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.dropDeltaTableWithRetry(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.6153827Z tests               |  at io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName(TestDeltaLakeColumnMappingMode.java:360)
2023-02-28T06:35:19.6154986Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-28T06:35:19.6155824Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
2023-02-28T06:35:19.6156727Z tests               |  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.6157644Z tests               |  at java.base/java.lang.reflect.Method.invoke(Method.java:568)
2023-02-28T06:35:19.6158475Z tests               |  at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
2023-02-28T06:35:19.6159175Z tests               |  at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
2023-02-28T06:35:19.6159774Z tests               |  at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
2023-02-28T06:35:19.6160569Z tests               |  at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
2023-02-28T06:35:19.6161298Z tests               |  at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
2023-02-28T06:35:19.6162019Z tests               |  at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
2023-02-28T06:35:19.6162765Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2023-02-28T06:35:19.6163535Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2023-02-28T06:35:19.6164173Z tests               |  at java.base/java.lang.Thread.run(Thread.java:833)
2023-02-28T06:35:19.6166633Z tests               | Caused by: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6168420Z tests               |  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-02-28T06:35:19.6169498Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
2023-02-28T06:35:19.6170465Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6171118Z tests               |  at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
2023-02-28T06:35:19.6172163Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
2023-02-28T06:35:19.6173454Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
2023-02-28T06:35:19.6174327Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6175220Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-02-28T06:35:19.6176329Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-02-28T06:35:19.6177518Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.6178723Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
2023-02-28T06:35:19.6179816Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
2023-02-28T06:35:19.6180693Z tests               |  at java.security.AccessController.doPrivileged(Native Method)
2023-02-28T06:35:19.6181300Z tests               |  at javax.security.auth.Subject.doAs(Subject.java:422)
2023-02-28T06:35:19.6182011Z tests               |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-02-28T06:35:19.6182968Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
2023-02-28T06:35:19.6183837Z tests               |  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-02-28T06:35:19.6184478Z tests               |  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-02-28T06:35:19.6185162Z tests               |  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-02-28T06:35:19.6185914Z tests               |  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-02-28T06:35:19.6186504Z tests               |  at java.lang.Thread.run(Thread.java:750)
2023-02-28T06:35:19.6188004Z tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6189189Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
2023-02-28T06:35:19.6190076Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
2023-02-28T06:35:19.6190932Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
2023-02-28T06:35:19.6191863Z tests               |  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
2023-02-28T06:35:19.6192759Z tests               |  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
2023-02-28T06:35:19.6193623Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
2023-02-28T06:35:19.6194523Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
2023-02-28T06:35:19.6195482Z tests               |  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
2023-02-28T06:35:19.6196594Z tests               |  at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
2023-02-28T06:35:19.6197739Z tests               |  at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
2023-02-28T06:35:19.6198655Z tests               |  at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
2023-02-28T06:35:19.6199409Z tests               |  at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
2023-02-28T06:35:19.6200433Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
2023-02-28T06:35:19.6201454Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
2023-02-28T06:35:19.6202417Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
2023-02-28T06:35:19.6203408Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:246)
2023-02-28T06:35:19.6204320Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
2023-02-28T06:35:19.6205134Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
2023-02-28T06:35:19.6205928Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
2023-02-28T06:35:19.6206680Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.6207420Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
2023-02-28T06:35:19.6208186Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
2023-02-28T06:35:19.6209086Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:246)
2023-02-28T06:35:19.6210069Z tests               |  at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:231)
2023-02-28T06:35:19.6211069Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:244)
2023-02-28T06:35:19.6212030Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:237)
2023-02-28T06:35:19.6212914Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
2023-02-28T06:35:19.6213681Z tests               |  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
2023-02-28T06:35:19.6214514Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
2023-02-28T06:35:19.6215658Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6216783Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
2023-02-28T06:35:19.6217971Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
2023-02-28T06:35:19.6219100Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6220302Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6221196Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
2023-02-28T06:35:19.6222035Z tests               |  at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:237)
2023-02-28T06:35:19.6222959Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
2023-02-28T06:35:19.6223924Z tests               |  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:237)
2023-02-28T06:35:19.6224852Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:191)
2023-02-28T06:35:19.6225724Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:182)
2023-02-28T06:35:19.6226433Z tests               |  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
2023-02-28T06:35:19.6227311Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
2023-02-28T06:35:19.6228225Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.6229150Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.6230335Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
2023-02-28T06:35:19.6231537Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.6232705Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.6233555Z tests               |  at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:692)
2023-02-28T06:35:19.6234307Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
2023-02-28T06:35:19.6235222Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
2023-02-28T06:35:19.6236275Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.6237572Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.6238273Z tests               |  ... 19 more
2023-02-28T06:35:19.6239573Z tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6240709Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
2023-02-28T06:35:19.6241385Z tests               |  at sun.reflect.GeneratedMethodAccessor394.invoke(Unknown Source)
2023-02-28T06:35:19.6242146Z tests               |  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.6244367Z tests               |  at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-28T06:35:19.6245007Z tests               |  at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1373)
2023-02-28T06:35:19.6248970Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
2023-02-28T06:35:19.6249714Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6250444Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
2023-02-28T06:35:19.6251273Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
2023-02-28T06:35:19.6252121Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
2023-02-28T06:35:19.6252999Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
2023-02-28T06:35:19.6253826Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
2023-02-28T06:35:19.6254669Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
2023-02-28T06:35:19.6255508Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
2023-02-28T06:35:19.6256372Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.6257281Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
2023-02-28T06:35:19.6258159Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.6258996Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
2023-02-28T06:35:19.6259714Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6260382Z tests               |  at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
2023-02-28T06:35:19.6261134Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
2023-02-28T06:35:19.6261697Z tests               |  ... 70 more
2023-02-28T06:35:19.6262817Z tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6263928Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
2023-02-28T06:35:19.6265010Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
2023-02-28T06:35:19.6266168Z tests               |  at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
2023-02-28T06:35:19.6267218Z tests               |  at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
2023-02-28T06:35:19.6268059Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
2023-02-28T06:35:19.6268752Z tests               |  ... 90 more
2023-02-28T06:35:19.6269288Z tests               | , Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk.
2023-02-28T06:35:19.6270151Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.buildExceptionFromTStatusSqlState(Unknown Source)
2023-02-28T06:35:19.6271121Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.pollForOperationCompletion(Unknown Source)
2023-02-28T06:35:19.6272120Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.executeStatementInternal(Unknown Source)
2023-02-28T06:35:19.6272973Z tests               |  at com.databricks.client.hivecommon.api.HS2Client.executeStatement(Unknown Source)
2023-02-28T06:35:19.6274036Z tests               |  at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.executeRowCountQueryHelper(Unknown Source)
2023-02-28T06:35:19.6275158Z tests               |  at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.execute(Unknown Source)
2023-02-28T06:35:19.6276063Z tests               |  at com.databricks.client.jdbc.common.SStatement.executeNoParams(Unknown Source)
2023-02-28T06:35:19.6276621Z tests               |  at com.databricks.client.jdbc.common.BaseStatement.execute(Unknown Source)
2023-02-28T06:35:19.6278183Z tests               |  at com.databricks.client.hivecommon.jdbc42.Hive42Statement.execute(Unknown Source)
2023-02-28T06:35:19.6279019Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:128)
2023-02-28T06:35:19.6279825Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
2023-02-28T06:35:19.6280684Z tests               |  at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
2023-02-28T06:35:19.6281412Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
2023-02-28T06:35:19.6282047Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.6282813Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.6283500Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.6284261Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.6284933Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.6285572Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.6286255Z tests               |  at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
2023-02-28T06:35:19.6287147Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.lambda$dropDeltaTableWithRetry$4(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.6287944Z tests               |  at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.6288554Z tests               |  at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.6289251Z tests               |  at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.6290003Z tests               |  at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.6290699Z tests               |  at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.6291329Z tests               |  at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.6292199Z tests               |  at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.dropDeltaTableWithRetry(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.6293623Z tests               |  at io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName(TestDeltaLakeColumnMappingMode.java:360)
2023-02-28T06:35:19.6294800Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-28T06:35:19.6295713Z tests               |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
2023-02-28T06:35:19.6296855Z tests               |  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.6297598Z tests               |  at java.base/java.lang.reflect.Method.invoke(Method.java:568)
2023-02-28T06:35:19.6298841Z tests               |  at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
2023-02-28T06:35:19.6299588Z tests               |  at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
2023-02-28T06:35:19.6300227Z tests               |  at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
2023-02-28T06:35:19.6300884Z tests               |  at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
2023-02-28T06:35:19.6301612Z tests               |  at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
2023-02-28T06:35:19.6302352Z tests               |  at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
2023-02-28T06:35:19.6303093Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2023-02-28T06:35:19.6303883Z tests               |  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2023-02-28T06:35:19.6304661Z tests               |  Suppressed: java.lang.Exception: Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk
2023-02-28T06:35:19.6305493Z tests               |      at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:136)
2023-02-28T06:35:19.6306281Z tests               |      at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
2023-02-28T06:35:19.6307047Z tests               |      at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
2023-02-28T06:35:19.6307830Z tests               |      at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
2023-02-28T06:35:19.6308534Z tests               |      at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.6309135Z tests               |      at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.6309813Z tests               |      at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.6310572Z tests               |      at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.6311237Z tests               |      at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.6311879Z tests               |      at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.6312557Z tests               |      at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
2023-02-28T06:35:19.6313445Z tests               |      at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.lambda$dropDeltaTableWithRetry$4(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.6314240Z tests               |      at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
2023-02-28T06:35:19.6314924Z tests               |      at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
2023-02-28T06:35:19.6315560Z tests               |      at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
2023-02-28T06:35:19.6316269Z tests               |      at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
2023-02-28T06:35:19.6317198Z tests               |      at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
2023-02-28T06:35:19.6317809Z tests               |      at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
2023-02-28T06:35:19.6318821Z tests               |      at io.trino.tests.product.deltalake.util.DeltaLakeTestUtils.dropDeltaTableWithRetry(DeltaLakeTestUtils.java:87)
2023-02-28T06:35:19.6320167Z tests               |      at io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName(TestDeltaLakeColumnMappingMode.java:360)
2023-02-28T06:35:19.6321421Z tests               |      at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2023-02-28T06:35:19.6322248Z tests               |      at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
2023-02-28T06:35:19.6323171Z tests               |      at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.6323931Z tests               |      at java.base/java.lang.reflect.Method.invoke(Method.java:568)
2023-02-28T06:35:19.6324675Z tests               |      at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
2023-02-28T06:35:19.6325397Z tests               |      at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
2023-02-28T06:35:19.6326042Z tests               |      at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
2023-02-28T06:35:19.6326696Z tests               |      at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
2023-02-28T06:35:19.6327434Z tests               |      at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
2023-02-28T06:35:19.6328151Z tests               |      at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
2023-02-28T06:35:19.6328907Z tests               |      at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
2023-02-28T06:35:19.6329699Z tests               |      at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
2023-02-28T06:35:19.6330323Z tests               |      at java.base/java.lang.Thread.run(Thread.java:833)
2023-02-28T06:35:19.6332712Z tests               | Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6334505Z tests               |  at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
2023-02-28T06:35:19.6335565Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
2023-02-28T06:35:19.6336449Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6337072Z tests               |  at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
2023-02-28T06:35:19.6338124Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
2023-02-28T06:35:19.6339397Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
2023-02-28T06:35:19.6340284Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6341034Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
2023-02-28T06:35:19.6342569Z tests               |  at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
2023-02-28T06:35:19.6343777Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.6345045Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
2023-02-28T06:35:19.6346148Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
2023-02-28T06:35:19.6347019Z tests               |  at java.security.AccessController.doPrivileged(Native Method)
2023-02-28T06:35:19.6347628Z tests               |  at javax.security.auth.Subject.doAs(Subject.java:422)
2023-02-28T06:35:19.6348342Z tests               |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
2023-02-28T06:35:19.6349336Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
2023-02-28T06:35:19.6350214Z tests               |  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2023-02-28T06:35:19.6350836Z tests               |  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2023-02-28T06:35:19.6351534Z tests               |  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2023-02-28T06:35:19.6352284Z tests               |  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2023-02-28T06:35:19.6352877Z tests               |  at java.lang.Thread.run(Thread.java:750)
2023-02-28T06:35:19.6354380Z tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6355585Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
2023-02-28T06:35:19.6356469Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
2023-02-28T06:35:19.6357487Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
2023-02-28T06:35:19.6358339Z tests               |  at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
2023-02-28T06:35:19.6359232Z tests               |  at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
2023-02-28T06:35:19.6360111Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
2023-02-28T06:35:19.6360946Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
2023-02-28T06:35:19.6361902Z tests               |  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
2023-02-28T06:35:19.6362924Z tests               |  at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
2023-02-28T06:35:19.6363917Z tests               |  at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
2023-02-28T06:35:19.6364896Z tests               |  at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:290)
2023-02-28T06:35:19.6365810Z tests               |  at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
2023-02-28T06:35:19.6366731Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
2023-02-28T06:35:19.6367728Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
2023-02-28T06:35:19.6368766Z tests               |  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
2023-02-28T06:35:19.6369762Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:246)
2023-02-28T06:35:19.6370667Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
2023-02-28T06:35:19.6371492Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
2023-02-28T06:35:19.6372402Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
2023-02-28T06:35:19.6373164Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.6373915Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
2023-02-28T06:35:19.6374708Z tests               |  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
2023-02-28T06:35:19.6375598Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:246)
2023-02-28T06:35:19.6376610Z tests               |  at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:231)
2023-02-28T06:35:19.6377616Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:244)
2023-02-28T06:35:19.6378565Z tests               |  at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:237)
2023-02-28T06:35:19.6379449Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
2023-02-28T06:35:19.6380219Z tests               |  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
2023-02-28T06:35:19.6381049Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
2023-02-28T06:35:19.6382085Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6383226Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
2023-02-28T06:35:19.6384319Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
2023-02-28T06:35:19.6385368Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6386460Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
2023-02-28T06:35:19.6387291Z tests               |  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
2023-02-28T06:35:19.6388255Z tests               |  at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:237)
2023-02-28T06:35:19.6389254Z tests               |  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
2023-02-28T06:35:19.6390236Z tests               |  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:237)
2023-02-28T06:35:19.6391147Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:191)
2023-02-28T06:35:19.6392099Z tests               |  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:182)
2023-02-28T06:35:19.6392806Z tests               |  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
2023-02-28T06:35:19.6393676Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
2023-02-28T06:35:19.6394603Z tests               |  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
2023-02-28T06:35:19.6395524Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.6396716Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:387)
2023-02-28T06:35:19.6398381Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
2023-02-28T06:35:19.6399571Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.6400404Z tests               |  at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:692)
2023-02-28T06:35:19.6401179Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
2023-02-28T06:35:19.6402108Z tests               |  at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
2023-02-28T06:35:19.6403166Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
2023-02-28T06:35:19.6404345Z tests               |  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
2023-02-28T06:35:19.6405046Z tests               |  ... 19 more
2023-02-28T06:35:19.6405407Z tests               | Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6407742Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1063)
2023-02-28T06:35:19.6408420Z tests               |  at sun.reflect.GeneratedMethodAccessor394.invoke(Unknown Source)
2023-02-28T06:35:19.6409185Z tests               |  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2023-02-28T06:35:19.6409872Z tests               |  at java.lang.reflect.Method.invoke(Method.java:498)
2023-02-28T06:35:19.6410501Z tests               |  at org.apache.spark.sql.hive.client.Shim_v0_14.dropTable(HiveShim.scala:1373)
2023-02-28T06:35:19.6411255Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$dropTable$1(HiveClientImpl.scala:642)
2023-02-28T06:35:19.6411965Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6412703Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:335)
2023-02-28T06:35:19.6413499Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:236)
2023-02-28T06:35:19.6414484Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:272)
2023-02-28T06:35:19.6415353Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:228)
2023-02-28T06:35:19.6416205Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:315)
2023-02-28T06:35:19.6417089Z tests               |  at org.apache.spark.sql.hive.client.HiveClientImpl.dropTable(HiveClientImpl.scala:642)
2023-02-28T06:35:19.6417932Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1(PoolingHiveClient.scala:352)
2023-02-28T06:35:19.6418786Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.6419682Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
2023-02-28T06:35:19.6420576Z tests               |  at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
2023-02-28T06:35:19.6421418Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
2023-02-28T06:35:19.6422144Z tests               |  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
2023-02-28T06:35:19.6422806Z tests               |  at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
2023-02-28T06:35:19.6423575Z tests               |  at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
2023-02-28T06:35:19.6424130Z tests               |  ... 70 more
2023-02-28T06:35:19.6425208Z tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 1a500807-40f7-469b-b34a-24fedf1bc29e; Proxy: null))
2023-02-28T06:35:19.6426384Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
2023-02-28T06:35:19.6427401Z tests               |  at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
2023-02-28T06:35:19.6428405Z tests               |  at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
2023-02-28T06:35:19.6429365Z tests               |  at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
2023-02-28T06:35:19.6430165Z tests               |  at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
2023-02-28T06:35:19.6430628Z tests               |  ... 90 more
2023-02-28T06:35:19.6431301Z tests               | , Query: DROP TABLE IF EXISTS default.test_dl_unsupported_column_mapping_mode_mukz521imk.
2023-02-28T06:35:19.6431789Z tests               |  ... 41 more
2023-02-28T06:35:19.6432127Z tests               | 
2023-02-28T06:35:19.6433636Z tests               | 2023-02-28 12:20:19 INFO: [71 of 132] io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName [name] (Groups: profile_specific_tests, delta-lake-exclude-91, delta-lake-databricks, delta-lake-oss, delta-lake-exclude-73)
findinpath commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4306675973/jobs/7517061578

Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: f34ea97f-2190-43d7-8b7d-9a00244f7e4b; Proxy: null))
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
tests               |   at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
tests               |   at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
tests               |   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
findinpath commented 1 year ago

I'm looking into how to extend the retry policy used when running the DROP TABLE statement on Databricks.
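A minimal sketch of what a widened retry could look like, assuming the product tests keep using Failsafe (visible as `dev.failsafe` in the traces above) and simply match on the Glue error text. The class name, method signature, and backoff/retry values below are illustrative, not the actual `DeltaLakeTestUtils.dropDeltaTableWithRetry` implementation:

```java
import static com.google.common.base.Throwables.getStackTraceAsString;

import dev.failsafe.Failsafe;
import dev.failsafe.RetryPolicy;

import java.sql.Connection;
import java.sql.Statement;
import java.time.Duration;

public final class GlueConcurrentModificationRetry
{
    // Error text reported by Glue in the failures above
    private static final String TABLE_BEING_MODIFIED = "Table being modified concurrently";

    private GlueConcurrentModificationRetry() {}

    // Hypothetical helper: retries DROP TABLE when Glue reports a concurrent modification
    public static void dropTableWithRetry(Connection connection, String tableName)
    {
        RetryPolicy<Object> retryPolicy = RetryPolicy.builder()
                // The Glue error is wrapped several layers deep, so match on the whole stack trace text
                .handleIf(failure -> getStackTraceAsString(failure).contains(TABLE_BEING_MODIFIED))
                .withBackoff(Duration.ofSeconds(1), Duration.ofSeconds(30))
                .withMaxRetries(5)
                .build();

        Failsafe.with(retryPolicy).run(() -> {
            try (Statement statement = connection.createStatement()) {
                statement.execute("DROP TABLE IF EXISTS " + tableName);
            }
        });
    }
}
```

Matching against the full stack trace string rather than the top-level exception type is deliberate: by the time the failure reaches the JDBC client, the Glue `ConcurrentModificationException` is wrapped as MetaException → HiveException → AnalysisException → HiveSQLException → SQLException, as the traces above show.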

findepi commented 1 year ago

🎉

pajaks commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4435274449/jobs/7782516065 TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName

findepi commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4467443324/jobs/7847242931?pr=16616

tests               | 2023-03-20 17:08:18 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeDropTableCompatibility.testDropTable [TRINO, DELTA, false] (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 6.4 seconds
tests               | 2023-03-20 17:08:18 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 92289870-549a-4b0b-80bf-a9ce0d301d1a; Proxy: null))
tests               |   at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
tests               |   at java.security.AccessController.doPrivileged(Native Method)
tests               |   at javax.security.auth.Subject.doAs(Subject.java:422)
tests               |   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInf
findepi commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4553222890/jobs/8033737944?pr=16784

tests               | 2023-03-30 01:45:02 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeColumnMappingMode.testUnsupportedOperationsColumnMappingModeName [id] (Groups: profile_specific_tests, delta-lake-exclude-91, delta-lake-databricks, delta-lake-oss, delta-lake-exclude-73) took 10.7 seconds
tests               | 2023-03-30 01:45:02 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 98cf355e-b7cc-4948-9603-2e1d0da3577c; Proxy: null))
tests               |   at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:47)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:435)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:257)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:123)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:48)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:52)
tests               |   at org.apache.spark.sql.hive.thri
findepi commented 1 year ago

https://github.com/trinodb/trino/actions/runs/4766640955/jobs/8476063634?pr=17175

tests               | 2023-04-22 04:04:40 WARNING: not retrying; stacktrace does not match pattern '\Q[Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP retry after response received with no Retry-After header, error: HTTP Response code: 503, Error message: Unknown.': [io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: cd0c6c90-dddb-41bb-8d50-91cb7b42f467; Proxy: null))
tests               |   at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
tests               |   at java.security.AccessController.doPrivileged(Native Method)
tests               |   at javax.security.auth.Subject.doAs(Subject.java:422)
tests               |   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
tests               |   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
tests               |   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
tests               |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
tests               |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
tests               |   at java.lang.Thread.run(Thread.java:750)
tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: cd0c6c90-dddb-41bb-8d50-91cb7b42f467; Proxy: null))
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
tests               |   at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
tests               |   at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
tests               |   at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
tests               |   at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
tests               |   at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1214)
tests               |   at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:301)
tests               |   at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
tests               |   at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:229)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$dropTable$1$adapted(PoolingHiveClient.scala:351)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
tests               |   at org.apache.spark.sql.hive.client.PoolingHiveClient.dropTable(PoolingHiveClient.scala:351)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$dropTable$1(HiveExternalCatalog.scala:635)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:154)
tests               |   ... 70 more
tests               | Caused by: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: cd0c6c90-dddb-41bb-8d50-91cb7b42f467; Proxy: null))
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.getHiveException(CatalogToHiveConverter.java:100)
tests               |   at com.amazonaws.glue.catalog.converters.CatalogToHiveConverter.wrapInHiveException(CatalogToHiveConverter.java:88)
tests               |   at com.amazonaws.glue.catalog.metastore.GlueMetastoreClientDelegate.dropTable(GlueMetastoreClientDelegate.java:492)
tests               |   at com.amazonaws.glue.catalog.metastore.AWSCatalogMetastoreClient.dropTable(AWSCatalogMetastoreClient.java:796)
tests               |   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1057)
tests               |   ... 90 more
tests               | , Query: DROP TABLE test_schema_with_location_a2fdzcskch.test_managed_table.
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:119)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
tests               |   at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
tests               |   at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
tests               |   at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
tests               |   at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
tests               |   at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
tests               |   at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
tests               |   at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
tests               |   at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
tests               |   at io.trino.tests.product.deltalake.TestDeltaLakeDropTableCompatibility.testDropTableAccuracy(TestDeltaLakeDropTableCompatibility.java:123)
tests               |   at io.trino.tests.product.deltalake.TestDeltaLakeDropTableCompatibility.testDropTable(TestDeltaLakeDropTableCompatibility.java:77)
tests               |   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
tests               |   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
tests               |   at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
tests               |   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
tests               |   at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
tests               |   at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
tests               |   at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
tests               |   at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
tests               |   at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
tests               |   at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
tests               |   at java.base/java.lang.Thread.run(Thread.java:833)
tests               | Caused by: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: cd0c6c90-dddb-41bb-8d50-91cb7b42f467; Proxy: null))
tests               |   at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
tests               |   at java.security.AccessController.doPrivileged(Native Method)
tests               |   at javax.security.auth.Subject.doAs(Subject.java:422)
tests               |   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
tests               |   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
tests               |   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
tests               |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
tests               |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
tests               |   at java.lang.Thread.run(Thread.java:750)
tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Cod
....
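Editor's note: the WARNING at the top of this trace shows that the product-test query executor only retries a failed statement when the failure's stack trace matches a known-flaky pattern, and the Glue `ConcurrentModificationException` message is not among those patterns. Below is a minimal, purely illustrative sketch of such a stack-trace predicate; the class name, the pattern list, and the use of Guava's `Throwables` are assumptions for illustration, not Trino's actual retry helper.

```java
import java.util.List;
import java.util.regex.Pattern;

import com.google.common.base.Throwables;

// Hypothetical helper: decide whether a failure looks like a known transient
// issue by matching its full stack trace against a list of patterns.
final class TransientFailures
{
    // Illustrative patterns only; the Glue concurrent-modification message is
    // the one missing from the retry filter in the log above.
    private static final List<Pattern> RETRYABLE_PATTERNS = List.of(
            Pattern.compile(Pattern.quote("Table being modified concurrently")),
            Pattern.compile(Pattern.quote("(500593) Communication link failure")));

    private TransientFailures() {}

    static boolean isRetryable(Throwable failure)
    {
        String stackTrace = Throwables.getStackTraceAsString(failure);
        return RETRYABLE_PATTERNS.stream()
                .anyMatch(pattern -> pattern.matcher(stackTrace).find());
    }
}
```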
findepi commented 1 year ago

https://github.com/trinodb/trino/actions/runs/5057141858/jobs/9075939229?pr=17603

tests               | 2023-05-23 18:32:17 INFO: FAILURE     /    io.trino.tests.product.deltalake.TestDeltaLakeActiveFilesCache.testRefreshTheFilesCacheWhenTableIsRecreated (Groups: profile_specific_tests, delta-lake-databricks, delta-lake-oss) took 8.3 seconds
tests               | 2023-05-23 18:32:17 SEVERE: Failure cause:
tests               | io.trino.tempto.query.QueryExecutionException: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 2ad6afa4-e98a-4d8f-a852-0b3c11920ebb; Proxy: null))
tests               |   at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
tests               |   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
tests               |   at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
tests               |   at java.security.AccessController.doPrivileged(Native Method)
tests               |   at javax.security.auth.Subject.doAs(Subject.java:422)
tests               |   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
tests               |   at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
tests               |   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
tests               |   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
tests               |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
tests               |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
tests               |   at java.lang.Thread.run(Thread.java:750)
tests               | Caused by: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 2ad6afa4-e98a-4d8f-a852-0b3c11920ebb; Proxy: null))
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:163)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:115)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:153)
tests               |   at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:364)
tests               |   at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:152)
tests               |   at org.apache.spark.sql.hive.HiveExternalCatalog.dropTable(HiveExternalCatalog.scala:633)
tests               |   at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropTable(ExternalCatalogWithListener.scala:112)
tests               |   at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.dropTable(SessionCatalog.scala:1389)
tests               |   at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.dropTable(ManagedCatalogSessionCatalog.scala:1222)
tests               |   at com.databricks.sql.DatabricksSessionCatalog.dropTable(DatabricksSessionCatalog.scala:301)
tests               |   at org.apache.spark.sql.execution.command.DropTableCommand.run(ddl.scala:271)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:78)
tests               |   at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:89)
tests               |   at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:229)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
tests               |   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
tests               |   at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
tests               |   at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.execute(Unknown Source)
tests               |   at com.databricks.client.jdbc.common.SStatement.executeNoParams(Unknown Source)
tests               |   at com.databricks.client.jdbc.common.BaseStatement.execute(Unknown Source)
tests               |   at com.databricks.client.hivecommon.jdbc42.Hive42Statement.execute(Unknown Source)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:128)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
tests               |   at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
tests               |   at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
tests               |   at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
tests               |   at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
tests               |   at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
tests               |   at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
tests               |   at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
tests               |   at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
tests               |   at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
tests               |   at io.trino.tests.product.deltalake.TestDeltaLakeActiveFilesCache.testRefreshTheFilesCacheWhenTableIsRecreated(TestDeltaLakeActiveFilesCache.java:70)
tests               |   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
tests               |   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
tests               |   at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
tests               |   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
tests               |   at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
tests               |   at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
tests               |   at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
tests               |   at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
tests               |   at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
tests               |   at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
tests               |   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
tests               |   Suppressed: java.lang.Exception: Query: DROP TABLE default.test_dl_cached_table_files_refresh_bbw267yk1h
tests               |       at io.trino.tempto.query.JdbcQueryExecutor.executeQueryNoParams(JdbcQueryExecutor.java:136)
tests               |       at io.trino.tempto.query.JdbcQueryExecutor.execute(JdbcQueryExecutor.java:112)
tests               |       at io.trino.tempto.query.JdbcQueryExecutor.executeQuery(JdbcQueryExecutor.java:84)
tests               |       at io.trino.tests.product.utils.QueryExecutors$3.lambda$executeQuery$0(QueryExecutors.java:136)
tests               |       at dev.failsafe.Functions.lambda$toCtxSupplier$11(Functions.java:243)
tests               |       at dev.failsafe.Functions.lambda$get$0(Functions.java:46)
tests               |       at dev.failsafe.internal.RetryPolicyExecutor.lambda$apply$0(RetryPolicyExecutor.java:74)
tests               |       at dev.failsafe.SyncExecutionImpl.executeSync(SyncExecutionImpl.java:187)
tests               |       at dev.failsafe.FailsafeExecutor.call(FailsafeExecutor.java:376)
tests               |       at dev.failsafe.FailsafeExecutor.get(FailsafeExecutor.java:112)
tests               |       at io.trino.tests.product.utils.QueryExecutors$3.executeQuery(QueryExecutors.java:136)
tests               |       at io.trino.tests.product.deltalake.TestDeltaLakeActiveFilesCache.testRefreshTheFilesCacheWhenTableIsRecreated(TestDeltaLakeActiveFilesCache.java:70)
tests               |       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
tests               |       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
tests               |       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
tests               |       at java.base/java.lang.reflect.Method.invoke(Method.java:568)
tests               |       at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
tests               |       at org.testng.internal.Invoker.invokeMethod(Invoker.java:645)
tests               |       at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:851)
tests               |       at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1177)
tests               |       at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:129)
tests               |       at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:112)
tests               |       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
tests               |       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
tests               |       at java.base/java.lang.Thread.run(Thread.java:833)
tests               | Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Table being modified concurrently. (Service: AWSGlue; Status Code: 400; Error Code: ConcurrentModificationException; Request ID: 2ad6afa4-e98a-4d8f-a852-0b3c11920ebb; Proxy: null))
ebyhr commented 1 year ago

TestDeltaLakeActiveFilesCache.testRefreshTheFilesCacheWhenTableIsRecreated

I expect this test was fixed by the recent commit https://github.com/trinodb/trino/commit/f349a0d948b947e3eac70ec089d5c91935f5e784. Let me close.
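Editor's note: the linked commit is not reproduced here, so the following is only a general illustration of how a test could tolerate this error, not a description of that fix. It wraps a metastore-mutating statement in a `dev.failsafe` retry policy (the library already visible in the traces above) that treats the Glue concurrent-modification message as retryable. The class name, retry parameters, and SQL text are hypothetical.

```java
import java.sql.Connection;
import java.sql.Statement;
import java.time.Duration;

import com.google.common.base.Throwables;
import dev.failsafe.Failsafe;
import dev.failsafe.RetryPolicy;

// Hypothetical usage: retry a statement when Glue reports a concurrent
// modification, with a small exponential backoff between attempts.
class RetryDropTableExample
{
    static void dropTableWithRetry(Connection connection, String sql)
    {
        RetryPolicy<Object> retryOnGlueConflict = RetryPolicy.builder()
                .handleIf(failure -> Throwables.getStackTraceAsString(failure)
                        .contains("Table being modified concurrently"))
                .withBackoff(Duration.ofSeconds(1), Duration.ofSeconds(10))
                .withMaxRetries(5)
                .build();

        Failsafe.with(retryOnGlueConflict).run(() -> {
            try (Statement statement = connection.createStatement()) {
                statement.execute(sql); // e.g. "DROP TABLE default.test_table"
            }
        });
    }
}
```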