Closed davidzenisu closed 2 months ago
could be related to https://github.com/databricks/cli/issues/1080
@davidhoferzeni @andrewnester
Looks like the error message changed and now gives a hint about what the problem might be:
< HTTP/2.0 403 Forbidden
< {
< "details": [
< {
< "@type": "type.googleapis.com/google.rpc.RequestInfo",
< "request_id": "<request-id>",
< "serving_data": ""
< }
< ],
< "error_code": "PERMISSION_DENIED",
< "message": "AAD Token exchange using Azure Managed Identity Credential with Access Credential Id <some-id>... (33 more bytes)"
The info in the message is not perfect, but with PERMISSION_DENIED it points in the right direction.
Adding CREATE_CONNECTION via databricks grants update metastore fixed the issue for me. The service principal is now able to create the connection successfully.
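A rough sketch of that grant command, assuming the current CLI's `grants update` syntax and the Unity Catalog grants `changes` JSON shape; the metastore ID and the principal's application ID are placeholders for your environment, not values from this thread:

```shell
# Sketch: grant CREATE_CONNECTION on the metastore to the service principal.
# METASTORE_ID and <sp-application-id> are placeholders; adjust to your setup.
METASTORE_ID="<your-metastore-id>"
PAYLOAD='{"changes":[{"principal":"<sp-application-id>","add":["CREATE_CONNECTION"]}]}'

# Sanity-check the JSON before sending it
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload OK"

databricks grants update metastore "$METASTORE_ID" --json "$PAYLOAD"
```

The principal must be identified the same way Unity Catalog knows it (for an Azure service principal, typically its application ID).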
@hargut
Thanks a lot for the update, I will retry today and report my findings!
@hargut
I've retried with the latest Databricks CLI version (v0.212.1). Unfortunately, I still get the same result (as in my original debug logs):
"error_code": "INTERNAL_ERROR",
"message": ""
Also, if I configure the service principal as an "Account admin", creating the storage credential works without any problems, so I'm assuming it is related to a configuration issue on the workspace and/or catalog level (and not to something Azure-specific).
In any case, thanks a lot for the support. I'll try to bring it up directly with the developers in an upcoming session!
@davidhoferzeni Account Admin will likely not work; this is a Unity Catalog / metastore permission. It only started working for me after adding the permission described above. I'm not sure whether Account Admin covers metastore admin for all metastores; if the metastore is created manually, the default metastore admin is only the user who created it.
@davidzenisu does the issue still persist for you in the latest CLI?
Closing due to no response; feel free to reopen if the issue persists.
Since I don't have access to my original testing setup anymore, I'm currently unable to verify if the issue persists. Thanks for updating the issue, I will create a separate case if I gain additional insights.
Describe the issue
Using an Azure Service Principal for authentication (as documented here) to create a storage credential fails with error code 500 and without any error message (see debug logs further below).
The same problem occurs when trying to roll out the storage credential using terraform.
Seems to be a similar situation with service principal credentials (but a different way of authentication): https://github.com/databricks/cli/issues/1080 https://github.com/databricks/terraform-provider-databricks/issues/3022
Steps to reproduce the behavior
1. Authenticating with a service principal client & secret (as documented here) under the profile AZURESP
2. Trying to create a storage credential: databricks storage-credentials create --json '<json_content>' --profile AZURESP

The command fails with Error: and no additional message.

Expected Behavior
Storage credential is created successfully.
Actual Behavior
Return code 500 and no error message.
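The `<json_content>` placeholder in the reproduction steps is elided in the report; for context, a hypothetical payload for an Azure managed identity storage credential (field names following the Unity Catalog storage-credentials API; all values are made-up placeholders) could look like:

```shell
# Hypothetical payload; all values are placeholders, not from the report.
PAYLOAD='{
  "name": "my-storage-credential",
  "azure_managed_identity": {
    "access_connector_id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Databricks/accessConnectors/<connector>"
  }
}'

# Validate the JSON locally before calling the CLI
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload OK"

databricks storage-credentials create --json "$PAYLOAD" --profile AZURESP
```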
OS and CLI version
Databricks CLI v0.210.2
Linux (WSL 2, Ubuntu 22.04.2 LTS)
Is this a regression?
No.
Debug Logs