databricks / terraform-provider-databricks

Databricks Terraform Provider
https://registry.terraform.io/providers/databricks/databricks/latest

[ISSUE] Issue with `databricks_permissions` resource #4171

Closed: pdbarrington closed this issue 3 weeks ago

pdbarrington commented 3 weeks ago

Configuration

resource "databricks_cluster" "xyz" {
  spark_version       = "15.4.x-scala2.12"
  runtime_engine      = "STANDARD"
  node_type_id        = "n2d-highmem-32"
  driver_node_type_id = "n2d-highmem-64"
  gcp_attributes {
    zone_id                = "HA"
    google_service_account = "TF_Email"
    availability           = "PREEMPTIBLE_WITH_FALLBACK_GCP"
  }
  cluster_name = "XYZ"
  cluster_log_conf {
    dbfs {
      destination = "dbfs:/cluster-logs"
    }
  }
  autotermination_minutes = 30
  autoscale {
    min_workers = 2
    max_workers = 20
  }
}

resource "databricks_permissions" "yxz_permissions" {
  cluster_id = databricks_cluster.xyz.id
  access_control {
    permission_level = "CAN_RESTART"
    group_name       = "group1"
  }
  access_control {
    permission_level = "CAN_RESTART"
    group_name       = "group2"
  }
  access_control {
    user_name        = "TF_Email"
    permission_level = "CAN_MANAGE"
  }
  access_control {
    user_name        = "email2"
    permission_level = "CAN_RESTART"
  }
}

Change:
Removed the "spark.driver.memory" entry from spark_conf.

Plan output:

~ resource "databricks_cluster" "xyz" {
      ~ driver_node_type_id          = "n2d-highmem-32" -> "n2d-highmem-64"
        id                           = "0000-111111-abcdefgh"
      ~ spark_conf                   = {
          - "spark.driver.memory"                             = "16g" -> null
            # (16 unchanged elements hidden)
        }
        # (14 unchanged attributes hidden)

        # (5 unchanged blocks hidden)
    }

~ resource "databricks_permissions" "yxz_permissions" {
        id          = "/clusters/0000-111111-abcdefgh"
        # (2 unchanged attributes hidden)

      - access_control {
          - permission_level       = "CAN_RESTART" -> null
          - user_name              = "email2" -> null
            # (2 unchanged attributes hidden)
        }
      - access_control {
          - group_name             = "group1" -> null
          - permission_level       = "CAN_RESTART" -> null
            # (2 unchanged attributes hidden)
        }
      - access_control {
          - group_name             = "group2" -> null
          - permission_level       = "CAN_RESTART" -> null
            # (2 unchanged attributes hidden)
        }
      + access_control {
          + permission_level       = "CAN_MANAGE"
          + user_name              = "TF_Email"
            # (2 unchanged attributes hidden)
        }
      + access_control {
          + group_name       = "group1"
          + permission_level = "CAN_RESTART"
        }
      + access_control {
          + group_name       = "group2"
          + permission_level = "CAN_RESTART"
        }
      + access_control {
          + permission_level = "CAN_RESTART"
          + user_name        = "email2"
        }
    }

Expected Behavior

The terraform plan should only show changes related to the removal of the spark config. It should not show any changes to cluster permissions, as none were made.

Actual Behavior

The terraform plan shows changes to cluster permissions even though none were made. This is repeated for every cluster in our configuration. The resulting noise makes it very difficult to assess the impact of a change, because every spurious diff has to be carefully reviewed. As shown above, every access_control entry except "TF_Email" is removed (set to null) and then re-added, even though no changes were made.

Steps to Reproduce

  1. Create a cluster via "databricks_cluster".
  2. Assign some cluster "databricks_permissions" to a user, group, or service principal.
  3. Change something on the cluster that is unrelated to cluster permissions (e.g. remove a spark config entry or change the node type).
  4. Note the terraform plan states permission changes were made to "databricks_permissions" despite the change being unrelated to cluster permissions.
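For step 3, the unrelated change can be as small as deleting a single spark_conf entry. A minimal sketch, based on the configuration above (the remaining spark_conf entries are elided as placeholders):

```hcl
resource "databricks_cluster" "xyz" {
  # ...all other attributes unchanged...
  spark_conf = {
    # "spark.driver.memory" = "16g"  # removing only this line is enough to
                                     # trigger the spurious access_control diff
    # ...remaining spark_conf entries unchanged...
  }
}
```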

Terraform and provider versions

Is it a regression?

No, I don't believe this ever worked.

Debug Output

Will include in support ticket.

Important Factoids

No

Would you like to implement a fix?

No

pdbarrington commented 3 weeks ago

Provider version 1.53.0 fixes this.
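For anyone else hitting this, constraining the provider to at least that release should pick up the fix. A minimal sketch of the version pin (source address as published on the Terraform Registry):

```hcl
terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = ">= 1.53.0"
    }
  }
}
```

Run `terraform init -upgrade` after updating the constraint so the newer provider binary is actually installed.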