databricks-demos / dbdemos

Demos to implement your Databricks Lakehouse

Delta Live Tables stops in PROVISIONING_PIPELINE_RESOURCES state #92

Closed: dkim94 closed this issue 5 months ago

dkim94 commented 6 months ago

Whilst running the rag_chatbot demo (01-Data-Preparation-and-Index), I am seeing an error with the Delta Live Tables pipeline.

In the "Create the Self-managed vector search using our endpoint" cell, the log shows that it is currently provisioning pipeline resources. However, it eventually fails.

Here is the log from the cell:

Waiting for index to be ready, this can take a few min... {'detailed_state': 'PROVISIONING_PIPELINE_RESOURCES', 'message': 'Index is currently pending setup of pipeline resources. Check latest status in Delta Live Tables: https://dbc-73e12b42-36d2.cloud.databricks.com#joblist/pipelines/18350724-c5a7-4af0-9d11-d5b184b74641/updates/f1a8374d-1edf-4f7a-a6a3-b7aff0831d5e', 'indexed_row_count': 0, 'provisioning_status': {}, 'ready': False, 'index_url': 'dbc-73e12b42-36d2.cloud.databricks.com/api/2.0/vector-search/endpoints/dbdemos_vs_endpoint/indexes/main.rag_chatbot.databricks_documentation_vs_index'} - pipeline url:dbc-73e12b42-36d2.cloud.databricks.com/api/2.0/vector-search/endpoints/dbdemos_vs_endpoint/indexes/main.rag_chatbot.databricks_documentation_vs_index
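For context, here is a minimal sketch of the kind of wait loop that cell runs, assuming the databricks-vectorsearch client. The endpoint and index names are taken from the log above; the actual demo code may differ:

import time
from databricks.vector_search.client import VectorSearchClient

vsc = VectorSearchClient()
index = vsc.get_index(
    endpoint_name="dbdemos_vs_endpoint",
    index_name="main.rag_chatbot.databricks_documentation_vs_index",
)

# Poll the index status; 'ready' only flips to True once the backing
# Delta Live Tables pipeline has provisioned its resources and synced rows.
while True:
    status = index.describe().get("status", {})
    if status.get("ready", False):
        break
    print(f"Waiting for index to be ready... {status.get('detailed_state')}")
    time.sleep(30)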

Here is the error message from the Delta Live Tables page. The pipeline fails to initialize with the following message:

DataPlaneException: Failed to start the DLT service on cluster 1212-013037-tgzej5mf. Please check the stack trace below or driver logs for more details.

com.databricks.pipelines.execution.service.UCContextInitializationException: Failed to initialize the UCContext
at com.databricks.pipelines.execution.service.PipelineEnvironment.initUCContext(PipelineEnvironment.scala:421)
at com.databricks.pipelines.execution.service.PipelineEnvironment.init(PipelineEnvironment.scala:378)
at com.databricks.pipelines.execution.service.StartService$.startService(StartService.scala:87)
at com.databricks.pipelines.execution.service.StartService$.main(StartService.scala:34)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command--1:1)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw$$iw$$iw.<init>(command--1:43)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw$$iw.<init>(command--1:45)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw.<init>(command--1:47)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw.<init>(command--1:49)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw.<init>(command--1:51)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read.<init>(command--1:53)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$.<init>(command--1:57)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$.<clinit>(command--1:-1)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$eval$.$print$lzycompute(<notebook>:7)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$eval$.$print(<notebook>:6)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$eval.$print(<notebook>:-1)
at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:223)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:236)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:1490)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:1443)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:236)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$34(DriverLocal.scala:1084)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$22(DriverLocal.scala:1067)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:85)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:85)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:1004)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:729)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:721)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:630)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:675)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:507)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:435)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:277)
at java.lang.Thread.run(Thread.java:750)
com.databricks.pipelines.common.CustomException: [DLT ERROR CODE: EXECUTION_SERVICE_STARTUP_FAILURE] [RequestId=3fc6df81-3c5d-4474-973e-c6680c83bc06 ErrorClass=INVALID_STATE] Metastore storage root URL does not exist.
at com.databricks.managedcatalog.ErrorDetailsHandler.wrapServiceException(ErrorDetailsHandler.scala:33)
at com.databricks.managedcatalog.ErrorDetailsHandler.wrapServiceException$(ErrorDetailsHandler.scala:23)
at com.databricks.managedcatalog.ManagedCatalogClientImpl.wrapServiceException(ManagedCatalogClientImpl.scala:142)
at com.databricks.managedcatalog.ManagedCatalogClientImpl.recordAndWrapException(ManagedCatalogClientImpl.scala:3808)
at com.databricks.managedcatalog.ManagedCatalogClientImpl.createStagingTable(ManagedCatalogClientImpl.scala:660)
at com.databricks.sql.managedcatalog.ManagedCatalogCommon.defaultTablePath(ManagedCatalogCommon.scala:355)
at com.databricks.sql.managedcatalog.ProfiledManagedCatalog.$anonfun$defaultTablePath$1(ProfiledManagedCatalog.scala:183)
at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:537)
at com.databricks.sql.managedcatalog.ProfiledManagedCatalog.$anonfun$profile$1(ProfiledManagedCatalog.scala:55)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at com.databricks.sql.managedcatalog.ProfiledManagedCatalog.profile(ProfiledManagedCatalog.scala:54)
at com.databricks.sql.managedcatalog.ProfiledManagedCatalog.defaultTablePath(ProfiledManagedCatalog.scala:183)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.defaultTablePath(ManagedCatalogSessionCatalog.scala:1005)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createDeltaTable$11(DeltaCatalog.scala:224)
at scala.Option.getOrElse(Option.scala:189)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createDeltaTable$1(DeltaCatalog.scala:224)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:266)
at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:264)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:107)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.com$databricks$sql$transaction$tahoe$catalog$DeltaCatalog$$createDeltaTable(DeltaCatalog.scala:148)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createTableWithRowColumnControls$1(DeltaCatalog.scala:740)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:266)
at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:264)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:107)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.createTableWithRowColumnControls(DeltaCatalog.scala:711)
at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.createTable(DeltaCatalog.scala:701)
at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.createTable(UnityCatalogV2Proxy.scala:219)
at org.apache.spark.sql.connector.catalog.TableCatalog.createTable(TableCatalog.java:244)
at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:56)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$2(V2CommandExec.scala:48)
at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:176)
at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:187)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$1(V2CommandExec.scala:48)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:47)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:45)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:56)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:286)
at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:286)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$8(SQLExecution.scala:320)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:579)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:223)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1146)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:155)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:521)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:285)
at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:259)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:280)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:265)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:472)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:76)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:472)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:39)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:316)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:312)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:448)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:265)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:373)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:265)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:218)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:215)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:262)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:123)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1146)
at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1153)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1153)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:113)
at org.apache.spark.sql.SparkSession.$anonfun$sql$5(SparkSession.scala:925)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1146)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:914)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:948)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:981)
at com.databricks.pipelines.execution.core.BaseUCContext.$anonfun$getOrCreateMaterializationPath$1(BaseUCContext.scala:151)
at scala.Option.getOrElse(Option.scala:189)
at com.databricks.pipelines.execution.core.BaseUCContext.getOrCreateMaterializationPath(BaseUCContext.scala:147)
at com.databricks.pipelines.execution.core.BaseUCContext.getOrCreateEventLogPath(BaseUCContext.scala:269)
at com.databricks.pipelines.execution.core.BaseUCContext.$anonfun$init$1(BaseUCContext.scala:332)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:104)
at scala.util.Using$.resource(Using.scala:269)
at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:103)
at com.databricks.pipelines.execution.core.BaseUCContext.init(BaseUCContext.scala:289)
at com.databricks.pipelines.execution.core.UCContextCompanion.create(BaseUCContext.scala:965)
at com.databricks.pipelines.execution.core.UCContextCompanion.create$(BaseUCContext.scala:948)
at com.databricks.pipelines.execution.core.UCContext$.create(UCContext_DBR_13_3_Plus.scala:31)
at com.databricks.pipelines.execution.service.PipelineEnvironment.$anonfun$initUCContext$1(PipelineEnvironment.scala:410)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.pipelines.execution.service.PipelineEnvironment.initUCContext(PipelineEnvironment.scala:402)
at com.databricks.pipelines.execution.service.PipelineEnvironment.init(PipelineEnvironment.scala:378)
at com.databricks.pipelines.execution.service.StartService$.startService(StartService.scala:87)
at com.databricks.pipelines.execution.service.StartService$.main(StartService.scala:34)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command--1:1)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw$$iw$$iw.<init>(command--1:43)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw$$iw.<init>(command--1:45)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw$$iw.<init>(command--1:47)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw$$iw.<init>(command--1:49)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$$iw.<init>(command--1:51)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read.<init>(command--1:53)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$.<init>(command--1:57)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$read$.<clinit>(command--1:-1)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$eval$.$print$lzycompute(<notebook>:7)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$eval$.$print(<notebook>:6)
at $line509f4bff4f394bcdb8a304b0e0c0055525.$eval.$print(<notebook>:-1)
at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-2)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:223)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:236)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:1490)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:1443)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:236)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$34(DriverLocal.scala:1084)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:99)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$22(DriverLocal.scala:1067)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:85)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:85)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:1004)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:729)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:721)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:630)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:675)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:507)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:435)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:277)
at java.lang.Thread.run(Thread.java:750)

Has anyone seen or solved this issue?

(Images will be added later; I can't seem to upload them at the moment. In the meantime, you can go to this page and check 01-Data-Preparation-and-Index; the cell I mentioned is near the bottom.)

QuentinAmbard commented 6 months ago

Hi, the best way to know is to open the DLT pipeline job and check the error you get there!

dkim94 commented 6 months ago

> Hi, the best way to know is to open the DLT pipeline job and check the error you get there!

Thanks for the reply, but the log pasted above is the error from the DLT pipeline. It says the Metastore storage root URL does not exist. Any suggestions on how to solve this, or should I be looking somewhere else?

QuentinAmbard commented 6 months ago

Sorry, I read that too quickly on my phone. Is your catalog/schema working properly? Do you have data being written to your tables? (Can you run a select on your table and see the data before creating the index?)
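For instance, a quick check along these lines (using the table name from this demo) should return rows before the index is created:

# Sanity check: the source table should exist and contain data
# before the vector search index is built on top of it.
df = spark.sql("SELECT * FROM main.rag_chatbot.databricks_documentation LIMIT 10")
df.show()
print(f"Row count: {spark.table('main.rag_chatbot.databricks_documentation').count()}")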

dkim94 commented 6 months ago

Yes, I followed the demo and created the databricks_documentation table inside main.rag_chatbot. I can see the content from the Sample Data tab, and from the cell in the demo that runs a Spark query to display the table.

dkim94 commented 6 months ago

This is the standard error log from the Delta Live Tables workflow.

ANTLR Tool version 4.8 used for code generation does not match the current runtime version 4.9.3

I tried to update the version, but I could not access the compute (cluster) to do so. Could this be the problem? If so, how can I fix it?

QuentinAmbard commented 6 months ago

Hey @dkim94, I checked with our team and it looks like something isn't properly set up with your metastore. I'd double-check the permissions (https://docs.databricks.com/en/data-governance/unity-catalog/create-metastore.html#step-2-create-an-iam-role-to-access-the-storage-location). If it's still not working, could you reach out to support or your account team? They'd be able to provide better help!
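As a starting point, a sketch like this using the Databricks Python SDK (assuming databricks-sdk is available; this is not the demo's own code) shows which metastore the workspace is attached to and what its storage root is, i.e. the URL the error says does not exist:

from databricks.sdk import WorkspaceClient

# In a Databricks notebook this picks up the runtime's authentication.
w = WorkspaceClient()

# Summary of the metastore attached to the current workspace,
# including the storage root URL that the DLT error complains about.
summary = w.metastores.summary()
print(summary.name, summary.storage_root)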

QuentinAmbard commented 5 months ago

Hey, I'm going to close this. Please reopen if you still have an issue.