relferreira / metabase-sparksql-databricks-driver


Database not found even though it exists in SQL warehouse #12

Open · ektormak opened this issue 1 year ago

ektormak commented 1 year ago

Hello,

I am trying to run queries against a pre-existing catalog and schemas through an existing SQL warehouse in Databricks. However, when I create the connection in Metabase and enter the database of choice, it gets stuck in the "syncing tables" state, and I see the following in the Metabase server logs:

metabase       | 2023-01-16 17:03:28,023 DEBUG middleware.log :: PUT /api/database/2 200 896.4 ms (8 DB calls) App DB connections: 0/7 Jetty threads: 5/50 (2 idle, 0 queued) (63 total active threads) Queries in flight: 0 (0 queued)
metabase       | 2023-01-16 17:03:28,059 DEBUG middleware.log :: GET /api/database 200 6.0 ms (3 DB calls) App DB connections: 0/7 Jetty threads: 5/50 (2 idle, 0 queued) (63 total active threads) Queries in flight: 0 (0 queued)
metabase       | 2023-01-16 17:03:31,325 INFO sync.util :: STARTING: Sync metadata for sparksql-databricks Database 2 'OI Databricks'
metabase       | 2023-01-16 17:03:31,326 DEBUG middleware.log :: POST /api/database/2/sync_schema 200 1.9 ms (3 DB calls) App DB connections: 0/7 Jetty threads: 5/50 (2 idle, 0 queued) (61 total active threads) Queries in flight: 0 (0 queued)
metabase       | 2023-01-16 17:03:31,326 INFO sync.util :: STARTING: step 'sync-timezone' for sparksql-databricks Database 2 'OI Databricks'
metabase       | 2023-01-16 17:03:31,326 INFO sync.util :: FINISHED: step 'sync-timezone' for sparksql-databricks Database 2 'OI Databricks' (85.6 µs)
metabase       | 2023-01-16 17:03:31,326 INFO sync.util :: STARTING: step 'sync-tables' for sparksql-databricks Database 2 'OI Databricks'
metabase       | 2023-01-16 17:03:32,105 WARN sync.util :: Error running step 'sync-tables' for sparksql-databricks Database 2 'OI Databricks'
metabase       | java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'metabase_poc.my_schema' not found
metabase       |    at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
metabase       |    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
metabase       |    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:36)
metabase       |    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:58)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
metabase       |    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
metabase       |    at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
metabase       |    at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
metabase       |    at java.security.AccessController.doPrivileged(Native Method)
metabase       |    at javax.security.auth.Subject.doAs(Subject.java:422)
metabase       |    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
metabase       |    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
metabase       |    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
metabase       |    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
metabase       |    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
metabase       |    at java.lang.Thread.run(Thread.java:750)
metabase       | Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'metabase_poc.my_schema' not found
metabase       |    at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.requireDbExists(SessionCatalog.scala:720)
metabase       |    at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.doListTables(SessionCatalog.scala:1619)
metabase       |    at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:1388)
metabase       |    at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.listTables(ManagedCatalogSessionCatalog.scala:1484)
metabase       |    at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.$anonfun$listTables$1(UnityCatalogV2Proxy.scala:189)
metabase       |    at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.assertSingleNamespace(UnityCatalogV2Proxy.scala:104)
metabase       |    at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.listTables(UnityCatalogV2Proxy.scala:188)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.ShowTablesExec.run(ShowTablesExec.scala:42)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:241)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
metabase       |    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:241)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:226)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:239)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:232)
metabase       |    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
metabase       |    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
metabase       |    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
metabase       |    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:232)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:232)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:186)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:177)
metabase       |    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
metabase       |    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:383)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
metabase       |    at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:691)
metabase       |    at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
metabase       |    at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
metabase       |    ... 20 more
metabase       | , Query: show table***.
metabase       |    at com.databricks.client.hivecommon.api.HS2Client.buildExceptionFromTStatusSqlState(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.api.HS2Client.pollForOperationCompletion(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.api.HS2Client.executeStatementInternal(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.api.HS2Client.executeStatement(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.executeNonRowCountQueryHelper(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.executeQuery(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.dataengine.HiveJDBCNativeQueryExecutor.<init>(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.dataengine.HiveJDBCDataEngine.prepare(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SPreparedStatement.<init>(Unknown Source)
metabase       |    at com.databricks.client.jdbc.jdbc41.S41PreparedStatement.<init>(Unknown Source)
metabase       |    at com.databricks.client.jdbc.jdbc42.S42PreparedStatement.<init>(Unknown Source)
metabase       |    at com.databricks.client.hivecommon.jdbc42.Hive42PreparedStatement.<init>(Unknown Source)
metabase       |    at com.databricks.client.spark.jdbc.SparkJDBCObjectFactory.createPreparedStatement(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.JDBCObjectFactory.newPreparedStatement(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SConnection$5.create(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SConnection$5.create(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SConnection$StatementCreator.create(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SConnection.prepareStatement(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SConnection.prepareStatement(Unknown Source)
metabase       |    at com.databricks.client.jdbc.common.SConnection.prepareStatement(Unknown Source)
metabase       |    at jdk.internal.reflect.GeneratedMethodAccessor232.invoke(Unknown Source)
metabase       |    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
metabase       |    at java.base/java.lang.reflect.Method.invoke(Unknown Source)
metabase       |    at clojure.lang.Reflector.invokeMatchingMethod(Reflector.java:167)
metabase       |    at clojure.lang.Reflector.invokeInstanceMethod(Reflector.java:102)
metabase       |    at metabase.driver.connection$decorate_and_fix$fn__8407.invoke(connection.clj:40)
metabase       |    at metabase.driver.connection.proxy$java.lang.Object$Connection$1d5212e.prepareStatement(Unknown Source)
metabase       |    at com.mchange.v2.c3p0.impl.NewProxyConnection.prepareStatement(NewProxyConnection.java:567)
metabase       |    at clojure.java.jdbc$prepare_statement.invokeStatic(jdbc.clj:679)
metabase       |    at clojure.java.jdbc$prepare_statement.invoke(jdbc.clj:626)
metabase       |    at clojure.java.jdbc$db_query_with_resultset_STAR_.invokeStatic(jdbc.clj:1105)
metabase       |    at clojure.java.jdbc$db_query_with_resultset_STAR_.invoke(jdbc.clj:1093)
metabase       |    at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1182)
metabase       |    at clojure.java.jdbc$query.invoke(jdbc.clj:1144)
metabase       |    at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1160)
metabase       |    at clojure.java.jdbc$query.invoke(jdbc.clj:1144)
metabase       |    at metabase.driver.sparksql_databricks$fn__81296$fn__81297.invoke(sparksql_databricks.clj:132)
metabase       |    at metabase.driver.sparksql_databricks$fn__81296.invokeStatic(sparksql_databricks.clj:130)
metabase       |    at metabase.driver.sparksql_databricks$fn__81296.invoke(sparksql_databricks.clj:127)
metabase       |    at clojure.lang.MultiFn.invoke(MultiFn.java:234)
metabase       |    at metabase.sync.fetch_metadata$fn__68316$db_metadata__68321$fn__68322.invoke(fetch_metadata.clj:14)
metabase       |    at metabase.sync.fetch_metadata$fn__68316$db_metadata__68321.invoke(fetch_metadata.clj:11)
metabase       |    at metabase.sync.sync_metadata.tables$fn__69631$sync_tables_and_database_BANG___69636$fn__69637.invoke(tables.clj:176)
metabase       |    at metabase.sync.sync_metadata.tables$fn__69631$sync_tables_and_database_BANG___69636.invoke(tables.clj:170)
metabase       |    at clojure.lang.AFn.applyToHelper(AFn.java:154)
metabase       |    at clojure.lang.AFn.applyTo(AFn.java:144)
metabase       |    at clojure.core$apply.invokeStatic(core.clj:669)
metabase       |    at clojure.core$apply.invoke(core.clj:662)
metabase       |    at metabase.sync.util$fn__42819$run_step_with_metadata__42824$fn__42828$fn__42830.doInvoke(util.clj:388)
metabase       |    at clojure.lang.RestFn.invoke(RestFn.java:397)
metabase       |    at metabase.sync.util$with_start_and_finish_logging_STAR_.invokeStatic(util.clj:102)
metabase       |    at metabase.sync.util$with_start_and_finish_logging_STAR_.invoke(util.clj:96)
metabase       |    at metabase.sync.util$with_start_and_finish_debug_logging.invokeStatic(util.clj:119)
metabase       |    at metabase.sync.util$with_start_and_finish_debug_logging.invoke(util.clj:116)
metabase       |    at metabase.sync.util$fn__42819$run_step_with_metadata__42824$fn__42828.invoke(util.clj:383)
metabase       |    at metabase.sync.util$fn__42819$run_step_with_metadata__42824.invoke(util.clj:378)
metabase       |    at metabase.sync.util$fn__43040$run_sync_operation__43045$fn__43046$fn__43054.invoke(util.clj:495)
metabase       |    at metabase.sync.util$fn__43040$run_sync_operation__43045$fn__43046.invoke(util.clj:493)
metabase       |    at metabase.sync.util$fn__43040$run_sync_operation__43045.invoke(util.clj:487)
metabase       |    at metabase.sync.sync_metadata$fn__70714$sync_db_metadata_BANG___70719$fn__70720$fn__70721.invoke(sync_metadata.clj:50)
metabase       |    at metabase.sync.util$do_with_error_handling.invokeStatic(util.clj:160)
metabase       |    at metabase.sync.util$do_with_error_handling.invoke(util.clj:153)
metabase       |    at clojure.core$partial$fn__5910.invoke(core.clj:2647)
metabase       |    at metabase.driver$fn__33693.invokeStatic(driver.clj:626)
metabase       |    at metabase.driver$fn__33693.invoke(driver.clj:626)
metabase       |    at clojure.lang.MultiFn.invoke(MultiFn.java:239)
metabase       |    at metabase.sync.util$sync_in_context$fn__42740.invoke(util.clj:138)
metabase       |    at metabase.sync.util$with_db_logging_disabled$fn__42737.invoke(util.clj:129)
metabase       |    at metabase.sync.util$with_start_and_finish_logging_STAR_.invokeStatic(util.clj:102)
metabase       |    at metabase.sync.util$with_start_and_finish_logging_STAR_.invoke(util.clj:96)
metabase       |    at metabase.sync.util$with_start_and_finish_logging$fn__42726.invoke(util.clj:114)
metabase       |    at metabase.sync.util$with_sync_events$fn__42721.invoke(util.clj:88)
metabase       |    at metabase.sync.util$with_duplicate_ops_prevented$fn__42712.invoke(util.clj:67)
metabase       |    at metabase.sync.util$do_sync_operation.invokeStatic(util.clj:181)
metabase       |    at metabase.sync.util$do_sync_operation.invoke(util.clj:178)
metabase       |    at metabase.sync.sync_metadata$fn__70714$sync_db_metadata_BANG___70719$fn__70720.invoke(sync_metadata.clj:49)
metabase       |    at metabase.sync.sync_metadata$fn__70714$sync_db_metadata_BANG___70719.invoke(sync_metadata.clj:46)
metabase       |    at metabase.api.database$fn__79428$fn__79429.invoke(database.clj:890)
metabase       |    at clojure.core$binding_conveyor_fn$fn__5823.invoke(core.clj:2047)
metabase       |    at clojure.lang.AFn.call(AFn.java:18)
metabase       |    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
metabase       |    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
metabase       |    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
metabase       | Caused by: com.databricks.client.support.exceptions.GeneralException: [Databricks][DatabricksJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'metabase_poc.my_schema' not found
metabase       |    at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:498)
metabase       |    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
metabase       |    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:36)
metabase       |    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:58)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:410)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:321)
metabase       |    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
metabase       |    at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:149)
metabase       |    at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:49)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:54)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:299)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:284)
metabase       |    at java.security.AccessController.doPrivileged(Native Method)
metabase       |    at javax.security.auth.Subject.doAs(Subject.java:422)
metabase       |    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:333)
metabase       |    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
metabase       |    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
metabase       |    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
metabase       |    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
metabase       |    at java.lang.Thread.run(Thread.java:750)
metabase       | Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'metabase_poc.my_schema' not found
metabase       |    at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.requireDbExists(SessionCatalog.scala:720)
metabase       |    at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.doListTables(SessionCatalog.scala:1619)
metabase       |    at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:1388)
metabase       |    at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.listTables(ManagedCatalogSessionCatalog.scala:1484)
metabase       |    at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.$anonfun$listTables$1(UnityCatalogV2Proxy.scala:189)
metabase       |    at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.assertSingleNamespace(UnityCatalogV2Proxy.scala:104)
metabase       |    at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.listTables(UnityCatalogV2Proxy.scala:188)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.ShowTablesExec.run(ShowTablesExec.scala:42)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
metabase       |    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:241)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:243)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:392)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:188)
metabase       |    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:142)
metabase       |    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:342)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:241)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:226)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:239)
metabase       |    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:232)
metabase       |    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
metabase       |    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:99)
metabase       |    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
metabase       |    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:232)
metabase       |    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:232)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:186)
metabase       |    at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:177)
metabase       |    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:238)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:404)
metabase       |    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:985)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:397)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:383)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:397)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:446)
metabase       |    at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:691)
metabase       |    at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency(QueryResultCache.scala:149)
metabase       |    at org.apache.spark.sql.execution.qrc.CacheEventLogger.recordLatency$(QueryResultCache.scala:145)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.recordLatency(SparkExecuteStatementOperation.scala:54)
metabase       |    at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:446)
metabase       |    ... 20 more
metabase       | , Query: show table***.
metabase       |    ... 83 more
metabase       | 2023-01-16 17:03:32,107 INFO sync.util :: FINISHED: step 'sync-tables' for sparksql-databricks Database 2 'OI Databricks' (780.5 ms)
metabase       | 2023-01-16 17:03:32,107 INFO sync.util :: STARTING: step 'sync-fields' for sparksql-databricks Database 2 'OI Databricks'
metabase       | 2023-01-16 17:03:32,108 INFO sync.util :: FINISHED: step 'sync-fields' for sparksql-databricks Database 2 'OI Databricks' (623.8 µs)
metabase       | 2023-01-16 17:03:32,108 INFO sync.util :: STARTING: step 'sync-fks' for sparksql-databricks Database 2 'OI Databricks'
metabase       | 2023-01-16 17:03:32,108 INFO sync.util :: FINISHED: step 'sync-fks' for sparksql-databricks Database 2 'OI Databricks' (399.8 µs)
metabase       | 2023-01-16 17:03:32,109 INFO sync.util :: STARTING: step 'sync-metabase-metadata' for sparksql-databricks Database 2 'OI Databricks'
metabase       | 2023-01-16 17:03:32,431 WARN sync.util :: Error syncing _metabase_metadata table for sparksql-databricks Database 2 'OI Databricks'

Here, metabase_poc.my_schema is the fully qualified name of my schema in Databricks. For example, I can query my data with

select * from metabase_poc.my_schema.my_table;

If I open the query editor in Metabase and execute the above query, it works. But there is no way to explore the tables and find out what useful visualizations you could build, since the "Browse data" section shows "This database doesn't have any tables".
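
For reference, the statement that fails during sync is a table listing (the log only shows the truncated "Query: show table***"). A hypothetical sketch of the mismatch in Databricks SQL, assuming the catalog-qualified name ends up being passed as a single database identifier:

-- Treating the catalog-qualified name as one identifier fails; this would be
-- consistent with the NoSuchDatabaseException in the log above
SHOW TABLES IN `metabase_poc.my_schema`;

-- The same listing succeeds once the catalog is selected separately
USE CATALOG metabase_poc;
SHOW TABLES IN my_schema;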

josei commented 1 year ago

I'd say you need to specify the catalog and schema in the JDBC connection string: ...;ConnCatalog=metabase_poc;ConnSchema=my_schema
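
For example, a full connection string with those options appended might look roughly like this; the workspace host, warehouse HTTP path, and token below are placeholders, and the URL prefix (jdbc:spark:// vs jdbc:databricks://) depends on the driver version bundled with the plugin:

jdbc:spark://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/<warehouse-id>;AuthMech=3;UID=token;PWD=<personal-access-token>;ConnCatalog=metabase_poc;ConnSchema=my_schema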