
Databricks Terraform Provider
https://registry.terraform.io/providers/databricks/databricks/latest

[ISSUE] Issue with `databricks_cluster` resource, `spark_version` is not optional when using `policy_id` #3671

Open wasicode01 opened 3 weeks ago

wasicode01 commented 3 weeks ago

Hi there,

I am working with Terraform v1.2.1.

Expected Behavior

Cluster creation should be allowed without `spark_version` being mandatory when `policy_id` is present. The Databricks API does allow me to omit `spark_version` when `policy_id` is present.

Actual Behavior

Currently the `spark_version` field is mandatory even when `policy_id` is set:

In https://github.com/databricks/terraform-provider-databricks/blob/main/clusters/resource_cluster.go:

`s.SchemaPath("spark_version").SetRequired()`

Steps to Reproduce

  1. Define a `databricks_cluster` resource that sets `policy_id` but omits `spark_version` (see the sketch below), then run `terraform apply`.
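
For illustration, a configuration of roughly this shape reproduces the error; the resource name matches the error output below, and the cluster values mirror the API payload further down:

```hcl
resource "databricks_cluster" "cluster_all_purpose" {
  cluster_name                = "single-node-cluster"
  node_type_id                = "Standard_DS3_v2"
  policy_id                   = "001146C0333EB9A4"
  num_workers                 = 0
  apply_policy_default_values = true
  # spark_version intentionally omitted -- expected to come from the policy

  custom_tags = {
    "ResourceClass" = "SingleNode"
  }
}
```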

Terraform and provider versions

Terraform version v1.2.1

Is it a regression?

No

Debug Output

```
│ Error: Missing required argument
│
│   on modules/databricks_cluster/main.tf line 20, in resource "databricks_cluster" "cluster_all_purpose":
│   20: resource "databricks_cluster" "cluster_all_purpose"
│
│ The argument "spark_version" is required, but no definition was found.
```

alexott commented 3 weeks ago

This is explicitly called out in the docs:

The primary use for cluster policies is to allow users to create policy-scoped clusters via UI rather than sharing configuration for API-created clusters. For example, when you specify policy_id of external metastore policy, you still have to fill in relevant keys for spark_conf. If relevant fields aren't filled in, then it will cause the configuration drift detected on each plan/apply, and Terraform will try to apply the detected changes.
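A common way to satisfy the requirement is to look up a runtime with the `databricks_spark_version` data source and pass its `id`. A minimal sketch, reusing the policy ID and node type from this issue:

```hcl
data "databricks_spark_version" "latest_lts" {
  long_term_support = true
}

resource "databricks_cluster" "cluster_all_purpose" {
  cluster_name  = "single-node-cluster"
  policy_id     = "001146C0333EB9A4"
  node_type_id  = "Standard_DS3_v2"
  num_workers   = 0
  # spark_version set explicitly so the schema requirement is met and no drift is detected
  spark_version = data.databricks_spark_version.latest_lts.id
}
```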

wasicode01 commented 3 weeks ago

Hi @alexott, of course that is what the Terraform documentation says. However, the Databricks API does allow you to create a cluster without specifying `spark_version`, so why not allow the same in Terraform?

For the API, this payload is accepted:

```json
{
  "cluster_name": "single-node-cluster",
  "node_type_id": "Standard_DS3_v2",
  "policy_id": "001146C0333EB9A4",
  "num_workers": 0,
  "apply_policy_default_values": true,
  "custom_tags": {
    "ResourceClass": "SingleNode"
  }
}
```