timharsch opened this issue 8 months ago
Documentation explicitly states this:
This resource can only be used with an account-level provider.
On workspace level there is no public API for that.
Thanks @alexott. Yeah that's the core of my issue. I need to do it at the workspace level. At this time I can only do it by navigating the workspace console UI. When you say there is no public API, do you mean just in the terraform provider? I'll take a look at the CLI, maybe I can implement a workaround with local-exec.
@alexott I spent quite a bit of time looking for a workaround with the CLI and/or the databricks command. No luck.
By public API I mean what is documented here: https://docs.databricks.com/api/ - the Databricks CLI & Terraform provider are generated from the same spec...
In my testing I found that there is indeed a public API, but it isn't documented anywhere. Below are the request details that work with a Databricks service principal created from within a workspace. Please note the missing account-id in the URL:
curl -X POST --header "Authorization: Bearer <token>" https://<workspace-host-url>/api/2.0/accounts/servicePrincipals/<service-principal-id>/credentials/secrets
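For comparison, the documented account-level endpoint carries the account id in the path (host shown here for AWS; Azure account consoles use accounts.azuredatabricks.net), roughly:
curl -X POST --header "Authorization: Bearer <token>" https://accounts.cloud.databricks.com/api/2.0/accounts/<account-id>/servicePrincipals/<service-principal-id>/credentials/secrets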
Also, we've been in contact with the support team, and they agreed that this endpoint is indeed undocumented and that it will be supported for our automation scenario.
Here's a hack while provider support for this is pending. You should be reluctant to add this to your code base, but for me it's slightly better than a multi-step apply with manual intervention.
We provision a Databricks PAT to make an API request through the external provider. The semantics of that provider are pretty bad for this use case, so it only actually creates the token on the first execution. Therefore you have to ignore_changes and manually taint resources when you want to update it (see the example taint command after the script below). It also doesn't work with a separate plan/apply step (lol).
For this example I put the token in a storage blob object. You probably actually want it in an ADO variable group or Azure Key Vault (a sketch of the Key Vault variant follows the blob resource below). This, as well as provisioning your SP, PAT, and workspace, is left as an exercise for the reader.
# token.tf
data "external" "service_principal_token" {
  program     = ["sh", "./create-sp-token.sh", azurerm_databricks_workspace.main.workspace_url, databricks_token.pat.token_value, databricks_service_principal.main.id]
  working_dir = path.module
}

resource "azurerm_storage_blob" "databricks_service_principal_token" {
  name                   = "databricks-service-principal-token"
  storage_account_name   = azurerm_storage_account.main.name
  storage_container_name = azurerm_storage_container.secrets.name
  type                   = "Block"
  source_content         = data.external.service_principal_token.result.secret

  # We have to ignore changes because we only get the real value on the first execution.
  lifecycle {
    ignore_changes = [source_content]
  }
}
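If you want the token in Key Vault instead of a storage blob, a minimal sketch might look like this (it assumes an existing azurerm_key_vault.main; the secret name is just a placeholder):
# keyvault.tf (alternative to the storage blob above)
resource "azurerm_key_vault_secret" "databricks_service_principal_token" {
  name         = "databricks-service-principal-token"
  key_vault_id = azurerm_key_vault.main.id
  value        = data.external.service_principal_token.result.secret

  # Same caveat as the blob: the real value only arrives on the first execution.
  lifecycle {
    ignore_changes = [value]
  }
}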
# create-sp-token.sh
#!/bin/sh
WORKSPACE_URL=$1
DATABRICKS_PAT=$2
SERVICE_PRINCIPAL_ID=$3

# Uncomment this to delete any existing tokens. Obviously, this is destructive.
# for token in $(curl -s --header "Authorization: Bearer $DATABRICKS_PAT" "https://$WORKSPACE_URL/api/2.0/accounts/servicePrincipals/$SERVICE_PRINCIPAL_ID/credentials/secrets" | jq -r '.secrets[].id'); do
#   curl -X DELETE -s --header "Authorization: Bearer $DATABRICKS_PAT" "https://$WORKSPACE_URL/api/2.0/accounts/servicePrincipals/$SERVICE_PRINCIPAL_ID/credentials/secrets/$token"
# done

# List existing secrets; an empty JSON object means none have been created yet.
check_token_result=$(curl -s --header "Authorization: Bearer $DATABRICKS_PAT" "https://$WORKSPACE_URL/api/2.0/accounts/servicePrincipals/$SERVICE_PRINCIPAL_ID/credentials/secrets")
if [ "$check_token_result" = "{}" ]; then
  # No secret yet: create one and emit the API response (JSON) for the external data source.
  curl -s -X POST --header "Authorization: Bearer $DATABRICKS_PAT" "https://$WORKSPACE_URL/api/2.0/accounts/servicePrincipals/$SERVICE_PRINCIPAL_ID/credentials/secrets"
else
  echo '{"secret": "token was already created, taint your resource, delete existing tokens and run again"}'
fi
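To update the secret later: delete the existing secrets (for example by temporarily uncommenting the loop above), taint the blob so Terraform rewrites it, and apply again. The resource address below matches the example above; on newer Terraform versions terraform apply -replace=<address> does the same thing.
terraform taint azurerm_storage_blob.databricks_service_principal_token
terraform apply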
I did try posting a question on SO before filing an issue, but after receiving no helpful responses and digging further through the docs and the GitHub repo, I'm increasingly certain that this is simply a missing implementation. StackOverflow post: https://stackoverflow.com/questions/78129238/how-to-create-a-databricks-workspace-level-service-principal-using-terraform
As explained there, I can easily create a workspace-level service principal like so:
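A minimal sketch of that resource, with a placeholder display name, looks roughly like this:
resource "databricks_service_principal" "main" {
  display_name = "my-service-principal"
}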
but there does not seem to be a way to create an OAuth secret for it with the Terraform provider. I can test the service principal created by Terraform by going to the workspace console: navigate to "Admin Settings" -> "Identity and access", click the Manage button next to Service principals, click the link for my service principal, open the "Secrets" tab, and click the "Generate secret" button. I can then write code that connects using the secret and queries tables and data with it. So, how do I do the equivalent of the "Generate secret" button using the Databricks Terraform provider?
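For reference, the resource I would expect to use looks roughly like this, but per the documentation it currently only works when the provider itself is configured at the account level:
resource "databricks_service_principal_secret" "this" {
  # Only works with an account-level provider today.
  service_principal_id = databricks_service_principal.main.id
}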
Expected Behavior
The databricks_service_principal_secret resource should support the workspace level.
Actual Behavior
As documented, it does not. So how do we create a secret for a workspace-level service principal?
Steps to Reproduce
Terraform and provider versions
Is it a regression?
no