filogzz52 opened this issue 3 days ago
@filogzz52: Could you run this in --debug mode (databricks --debug labs ucx ...) and share the error that this command fails on?
Here are the --debug results:
DEBUG [databricks.sdk] PUT /api/2.0/sql/config/warehouses
> {
> "data_access_config": [
> {
> "key": "spark.hadoop.javax.jdo.option.ConnectionPassword",
> "value": "**REDACTED**"
> },
> {
> "key": "spark.hadoop.javax.jdo.option.ConnectionURL",
> "value": "**REDACTED**"
> },
> "... (78 additional elements)"
> ],
> "security_policy": "DATA_ACCESS_CONTROL",
> "sql_configuration_parameters": {}
> }
< 400 Bad Request
< {
< "error_code": "INVALID_PARAMETER_VALUE",
< "message": "spark_conf.fs.azure.account.oauth2.client.id.[REDACTED].dfs.core.windows.net is not supp... (577 more bytes)"
< }
16:57:02 ERROR [d.l.u.azure.access] Adding uber principal to SQL warehouse Data Access Properties is failed using Python SDK with error "spark_conf.fs.azure.account.oauth2.client.id.[REDACTED].dfs.core.windows.net is not supported in data access configuration for Databricks SQL. Keys must match the following patterns – spark.databricks.hive.metastore.glueCatalog.enabled, spark.sql.hive.metastore.*, spark.sql.warehouse.dir, spark.hadoop.aws.glue.*, spark.hadoop.aws.region, spark.hadoop.datanucleus.*, spark.hadoop.fs.*, spark.hadoop.hive.*, spark.hadoop.javax.jdo.option.*, spark.hive.*, spark.sql.session.timeZone, spark.databricks.delta.catalog.update.enabled, spark.databricks.cloudfetch.override.enabled, spark.databricks.dataLineage.enabled, spark.databricks.hive.metastore.client.pool.type.". Please try applying the following configs manually in the worksapce admin UI:
spark.hadoop.javax.jdo.option.ConnectionPassword [REDACTED]
spark.hadoop.javax.jdo.option.ConnectionURL [REDACTED]
spark.sql.hive.metastore.jars maven
spark.sql.hive.metastore.version 3.1.0
spark.hadoop.javax.jdo.option.ConnectionUserName [REDACTED]
spark_conf.fs.azure.account.oauth2.client.id.[REDACTED].dfs.core.windows.net [REDACTED]
spark_conf.fs.azure.account.oauth.provider.type.[REDACTED].dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
spark_conf.fs.azure.account.oauth2.client.endpoint.[REDACTED].dfs.core.windows.net https://login.microsoftonline.com/[REDACTED]/oauth2/token
spark_conf.fs.azure.account.auth.type.[REDACTED].dfs.core.windows.net OAuth
spark_conf.fs.azure.account.oauth2.client.secret.[REDACTED].dfs.core.windows.net {{secrets/ucx_2700324992101495/uber_principal_secret}}
(The same five spark_conf.fs.azure.* entries repeat for each of the remaining redacted storage accounts.)
16:57:02 DEBUG [d.l.blueprint.installation] Loading WorkspaceConfig from config.yml
16:57:03 DEBUG [databricks.sdk] GET /api/2.0/workspace/export?path=/Applications/ucx/config.yml&direct_download=true
< 200 OK
< [raw stream]
16:57:03 DEBUG [databricks.sdk] Ignoring pat auth, because azure-cli is preferred
16:57:03 DEBUG [databricks.sdk] Ignoring basic auth, because azure-cli is preferred
16:57:03 DEBUG [databricks.sdk] Ignoring metadata-service auth, because azure-cli is preferred
16:57:03 DEBUG [databricks.sdk] Ignoring oauth-m2m auth, because azure-cli is preferred
16:57:03 DEBUG [databricks.sdk] Ignoring azure-client-secret auth, because azure-cli is preferred
16:57:03 DEBUG [databricks.sdk] Ignoring github-oidc-azure auth, because azure-cli is preferred
16:57:03 DEBUG [databricks.sdk] Attempting to configure auth: azure-cli
16:57:06 INFO [databricks.sdk] Using Azure CLI authentication with AAD tokens
16:57:06 DEBUG [d.l.u.framework.crawlers] [hive_metastore.ucx_2700324992101495.external_locations] fetching external_locations inventory
16:57:06 DEBUG [d.l.lsql.backends] [api][fetch] SELECT * FROM `hive_metastore`.`ucx_2700324992101495`.`external_locations`
16:57:06 DEBUG [d.l.lsql.core] Executing SQL statement: SELECT * FROM `hive_metastore`.`ucx_2700324992101495`.`external_locations`
16:57:06 DEBUG [databricks.sdk] POST /api/2.0/sql/statements/
> {
> "format": "JSON_ARRAY",
> "statement": "SELECT * FROM `hive_metastore`.`ucx_2700324992101495`.`external_locations`",
> "warehouse_id": "d6354f6edb594d88"
> }
< 200 OK
< {
< "manifest": {
< "chunks": [
< {
< "chunk_index": 0,
< "row_count": 45,
< "row_offset": 0
< }
< ],
< "format": "JSON_ARRAY",
< "schema": {
< "column_count": 2,
< "columns": [
< {
< "name": "location",
< "position": 0,
< "type_name": "STRING",
< "type_text": "STRING"
< },
< "... (1 additional elements)"
< ]
< },
< "total_chunk_count": 1,
< "total_row_count": 45,
< "truncated": false
< },
< "result": {
< "chunk_index": 0,
< "data_array": [
< [
< "abfss://[REDACTED].dfs.core.windows.net/-parquet/",
< "... (1 additional elements)"
< ],
< "... (44 additional elements)"
< ],
< "row_count": 45,
< "row_offset": 0
< },
< "statement_id": "01ef7f7f-515e-104d-a52f-ac2473425c04",
< "status": {
< "state": "SUCCEEDED"
< }
< }
16:57:06 DEBUG [databricks.sdk] GET /subscriptions?api-version=2022-12-01
< 200 OK
< {
< "count": {
< "type": "Total",
< "value": "**REDACTED**"
< },
< "value": "**REDACTED**"
< }
16:57:06 INFO [d.l.u.assessment.crawlers] Checking in subscription [REDACTED] for storage accounts
16:57:07 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/providers/Microsoft.Storage/storageAccounts?api-version=2023-01-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:07 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:08 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:08 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:08 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:08 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:08 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:09 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:09 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:09 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:09 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:09 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:10 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:10 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:10 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:10 DEBUG [databricks.sdk] GET /subscriptions/[REDACTED]/resourceGroups/[REDACTED]/providers/Microsoft.Storage/storageAccounts/[REDACTED]/providers/Microsoft.Authorization/roleAssignments?$filter=principalId eq '[REDACTED]'&api-version=2022-04-01
< 200 OK
< {
< "value": "**REDACTED**"
< }
16:57:11 DEBUG [databricks.sdk] DELETE /v1.0/applications(appId='[REDACTED]')
< 204 No Content
16:57:11 DEBUG [d.l.blueprint.installation] Loading Policy from policy-backup.json
16:57:11 DEBUG [databricks.sdk] GET /api/2.0/workspace/export?path=/Applications/ucx/policy-backup.json&direct_download=true
< 200 OK
< [raw stream]
16:57:11 DEBUG [databricks.sdk] POST /api/2.0/policies/clusters/edit
> {
> "definition": "{\"spark_version\": {\"type\": \"fixed\", \"value\": \"15.4.x-scala2.12\"}, \"node_type_id\": {\"type\": \"fixe... (899 more bytes)",
> "name": "Unity Catalog Migration (ucx_2700324992101495) ([REDACTED])",
> "policy_id": "001E144F2DD6EA16"
> }
< 200 OK
< {}
16:57:11 DEBUG [d.l.blueprint.installation] Loading GetWorkspaceWarehouseConfigResponse from warehouse-config-backup.json
16:57:11 DEBUG [databricks.sdk] GET /api/2.0/workspace/export?path=/Applications/ucx/warehouse-config-backup.json&direct_download=true
< 200 OK
< [raw stream]
16:57:11 DEBUG [databricks.sdk] PUT /api/2.0/sql/config/warehouses
> {
> "data_access_config": [
> {
> "key": "spark.hadoop.javax.jdo.option.ConnectionPassword",
> "value": "**REDACTED**"
> },
> {
> "key": "spark.hadoop.javax.jdo.option.ConnectionURL",
> "value": "**REDACTED**"
> },
> "... (3 additional elements)"
> ],
> "security_policy": "DATA_ACCESS_CONTROL",
> "sql_configuration_parameters": {}
> }
< 400 Bad Request
< {
< "error_code": "INVALID_PARAMETER_VALUE",
< "message": "enable_serverless_compute is required."
< }
16:57:11 ERROR [d.l.u.azure.access] Adding uber principal to SQL warehouse Data Access Properties is failed using Python SDK with error "enable_serverless_compute is required.". Please try applying the following configs manually in the workspace admin UI:
spark.hadoop.javax.jdo.option.ConnectionPassword [REDACTED]
spark.hadoop.javax.jdo.option.ConnectionURL [REDACTED]
spark.sql.hive.metastore.jars maven
spark.sql.hive.metastore.version 3.1.0
spark.hadoop.javax.jdo.option.ConnectionUserName [REDACTED]
16:57:11 ERROR [src/databricks/labs/ucx.create-uber-principal] Failed to call create-uber-principal: Traceback (most recent call last):
File "C:\Users\cenip\.databricks\labs\ucx\state\venv\Lib\site-packages\databricks\labs\blueprint\cli.py", line 113, in _route
cmd.fn(**kwargs)
File "C:\Users\cenip\.databricks\labs\ucx\lib\src\databricks\labs\ucx\cli.py", line 363, in create_uber_principal
workspace_context.azure_resource_permissions.create_uber_principal(prompts)
File "C:\Users\cenip\.databricks\labs\ucx\lib\src\databricks\labs\ucx\azure\access.py", line 419, in create_uber_principal
self._delete_uber_principal() # Clean up dangling resources
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cenip\.databricks\labs\ucx\lib\src\databricks\labs\ucx\azure\access.py", line 456, in _delete_uber_principal
log_permission_denied(
File "C:\Users\cenip\.databricks\labs\ucx\lib\src\databricks\labs\ucx\azure\access.py", line 428, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cenip\.databricks\labs\ucx\lib\src\databricks\labs\ucx\azure\access.py", line 374, in _remove_service_principal_configuration_from_workspace_warehouse_config
raise error
File "C:\Users\cenip\.databricks\labs\ucx\lib\src\databricks\labs\ucx\azure\access.py", line 362, in _remove_service_principal_configuration_from_workspace_warehouse_config
self._ws.warehouses.set_workspace_warehouse_config(
File "C:\Users\cenip\.databricks\labs\ucx\state\venv\Lib\site-packages\databricks\sdk\service\sql.py", line 7297, in set_workspace_warehouse_config
self._api.do('PUT', '/api/2.0/sql/config/warehouses', body=body, headers=headers)
File "C:\Users\cenip\.databricks\labs\ucx\state\venv\Lib\site-packages\databricks\sdk\core.py", line 157, in do
response = retryable(self._perform)(method,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cenip\.databricks\labs\ucx\state\venv\Lib\site-packages\databricks\sdk\retries.py", line 54, in wrapper
raise err
File "C:\Users\cenip\.databricks\labs\ucx\state\venv\Lib\site-packages\databricks\sdk\retries.py", line 33, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\cenip\.databricks\labs\ucx\state\venv\Lib\site-packages\databricks\sdk\core.py", line 247, in _perform
raise error from None
databricks.sdk.errors.platform.InvalidParameterValue: enable_serverless_compute is required.
16:57:11 INFO completed execution pid=3452 exit_code=0
Is there an existing issue for this?
Current Behavior
Our customer uses an external Azure Hive metastore, mounted via DBFS, in their Databricks environment. We have successfully completed all preliminary steps of the UCX workflow up to running the create-uber-principal command. That command fails with an error while adding the uber principal to the SQL warehouse Data Access Properties: it configures the service principal successfully for several storage locations, but when it reaches cu1af2 (a specific storage account), the Data Access Properties error aborts the flow. I have already applied the settings suggested in the error message manually in config.yml.
Also, the error message lists AWS-related configuration keys (for example spark.hadoop.aws.glue.*) that do not apply to an Azure environment.
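For whoever triages this: the allowed-pattern list is quoted verbatim from the first 400 response above. A minimal sketch of that matching rule (the rule itself is our guess at the server-side check; the account name used below is a placeholder) shows why every key carrying the literal spark_conf. prefix is rejected:

```python
import re

# Allowed key patterns, copied verbatim from the INVALID_PARAMETER_VALUE message.
ALLOWED_PATTERNS = [
    "spark.databricks.hive.metastore.glueCatalog.enabled",
    "spark.sql.hive.metastore.*",
    "spark.sql.warehouse.dir",
    "spark.hadoop.aws.glue.*",
    "spark.hadoop.aws.region",
    "spark.hadoop.datanucleus.*",
    "spark.hadoop.fs.*",
    "spark.hadoop.hive.*",
    "spark.hadoop.javax.jdo.option.*",
    "spark.hive.*",
    "spark.sql.session.timeZone",
    "spark.databricks.delta.catalog.update.enabled",
    "spark.databricks.cloudfetch.override.enabled",
    "spark.databricks.dataLineage.enabled",
    "spark.databricks.hive.metastore.client.pool.type",
]


def is_allowed(key: str) -> bool:
    """Return True if `key` matches one of the allowed patterns (`*` = any suffix)."""
    for pattern in ALLOWED_PATTERNS:
        regex = "^" + re.escape(pattern).replace(r"\*", ".*") + "$"
        if re.match(regex, key):
            return True
    return False


# Keys from the existing warehouse config pass:
print(is_allowed("spark.hadoop.javax.jdo.option.ConnectionURL"))  # True
# The keys UCX tried to add carry a literal "spark_conf." prefix and fail:
print(is_allowed("spark_conf.fs.azure.account.auth.type.acct.dfs.core.windows.net"))  # False
# The equivalent spark.hadoop.fs.* spelling would match an allowed pattern:
print(is_allowed("spark.hadoop.fs.azure.account.auth.type.acct.dfs.core.windows.net"))  # True
```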
Expected Behavior
The create-uber-principal command should create the service principal and configure the required properties in the Databricks workspace for the migration workflows, without running into configuration-compatibility errors.
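As a possible manual unblock while this is being fixed: the second 400 ("enable_serverless_compute is required.") suggests the PUT body sent to /api/2.0/sql/config/warehouses must carry that flag, which the SDK wrapper used by UCX appears to drop. A hedged sketch of a body builder follows; the field set is taken from the request bodies logged above, and the enable_serverless_compute handling is an assumption, untested against a live workspace:

```python
def warehouse_config_body(current: dict, enable_serverless: bool = True) -> dict:
    """Build a PUT body for /api/2.0/sql/config/warehouses that preserves the
    existing settings and adds the enable_serverless_compute flag.

    `current` is the JSON returned by GET /api/2.0/sql/config/warehouses.
    The enable_serverless_compute field is an assumption based on the 400
    message; it is not exposed by set_workspace_warehouse_config in the SDK.
    """
    return {
        "data_access_config": current.get("data_access_config", []),
        "security_policy": current.get("security_policy", "DATA_ACCESS_CONTROL"),
        "sql_configuration_parameters": current.get("sql_configuration_parameters", {}),
        "enable_serverless_compute": enable_serverless,
    }


# Against a live workspace this would go through the raw API client, e.g.:
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   current = w.api_client.do("GET", "/api/2.0/sql/config/warehouses")
#   w.api_client.do("PUT", "/api/2.0/sql/config/warehouses",
#                   body=warehouse_config_body(current))
print(warehouse_config_body({})["enable_serverless_compute"])  # True
```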
Steps To Reproduce
Cloud
Azure
Operating System
Windows
Version
latest via Databricks CLI
Relevant log output
(See the --debug output pasted above.)