databricks / terraform-provider-databricks

Databricks Terraform Provider
https://registry.terraform.io/providers/databricks/databricks/latest

[ISSUE] Issue with `databricks_catalog` resource #4136

Open LittleWat opened 1 month ago

LittleWat commented 1 month ago

related issue

Configuration

locals {
  catalog_name = "${var.workspace_name}-${var.environment}${var.catalog_suffix}"
}

resource "databricks_catalog" "workspace_catalog" {
  provider       = databricks.workspace
  name           = local.catalog_name
  storage_root   = databricks_external_location.catalog_location.url
  isolation_mode = "ISOLATED"
}

resource "databricks_external_location" "catalog_location" {
  provider        = databricks.workspace
  name            = "${local.catalog_name}-location"
  credential_name = databricks_storage_credential.external.id
  url             = "s3://${module.domain_storage.bucket_name}"
  depends_on      = [aws_iam_role.external_data_access, time_sleep.wait_role_creation]

  force_destroy = true
  force_update  = true
}

Expected Behavior

The configuration should apply successfully on the first run.

Actual Behavior

The first apply fails with the error below. Rerunning terraform apply succeeds, but a second run should not be needed to get past the error:

╷
  │ Error: Provider produced inconsistent final plan
  │ 
  │ When expanding the plan for databricks_catalog.workspace_catalog to include
  │ new values learned so far during apply, provider
  │ "registry.opentofu.org/databricks/databricks" produced an invalid new value
  │ for .storage_root: was
  │ cty.StringVal("s3://my-storage"), but now
  │ cty.StringVal("s3://my-storage/").
  │ 
  │ This is a bug in the provider, which should be reported in the provider's
  │ own issue tracker.
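
My reading of the error: Terraform plans storage_root with the value coming from the configuration ("s3://my-storage"), but during apply the value is reported back normalized with a trailing slash ("s3://my-storage/"), so the final value no longer matches the planned one and Terraform flags an inconsistent final plan.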

Steps to Reproduce

Terraform and provider versions

Is it a regression?

Debug Output

Important Factoids

Would you like to implement a fix?

LittleWat commented 3 weeks ago

This was solved by simply adding a trailing / to the end of the S3 URL, as follows:


resource "databricks_external_location" "catalog_location" {
  provider        = databricks.workspace
  name            = "${local.catalog_name}-location"
  credential_name = databricks_storage_credential.external.id
  url             = "s3://${module.domain_storage.bucket_name}/"  # <=============== / is added at the end
  depends_on      = [aws_iam_role.external_data_access, time_sleep.wait_role_creation]

  force_destroy = true
  force_update  = true
}

but it would be better if the provider could handle a URL without the trailing /.
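
For anyone hitting this before the provider normalizes the value itself, a slightly more defensive variant of the same workaround is to build the URL in a local so it always ends in exactly one /, regardless of what the module output looks like. This is only a sketch against the configuration above; the local name catalog_location_url is made up here:

locals {
  # Hypothetical helper local: strip any trailing "/" from the bucket name,
  # then append exactly one, so the planned value already matches the
  # slash-terminated form reported back after apply.
  catalog_location_url = "s3://${trimsuffix(module.domain_storage.bucket_name, "/")}/"
}

resource "databricks_external_location" "catalog_location" {
  provider        = databricks.workspace
  name            = "${local.catalog_name}-location"
  credential_name = databricks_storage_credential.external.id
  url             = local.catalog_location_url
  depends_on      = [aws_iam_role.external_data_access, time_sleep.wait_role_creation]

  force_destroy = true
  force_update  = true
}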