JFYI: this has to be addressed in the Go SDK / API definition, where the Spark version is defined as an always-required field: https://github.com/databricks/databricks-sdk-go/blob/a823ca32fc4199d8cf2269b78cfe89331b4b688a/service/compute/model.go#L1544-L1547
cc @mgyucht
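For context, the linked lines declare the field roughly as follows (paraphrased from the SDK source; the doc comment is abbreviated). Note the JSON struct tag has no `omitempty`, which is why an unset version still serializes into the request as an empty string:

```go
// The Spark version of the cluster, e.g. `3.3.x-scala2.11`. A list of
// available Spark versions can be retrieved by using the
// :method:clusters/sparkVersions API call.
SparkVersion string `json:"spark_version"`
```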
Describe the issue
I am trying to deploy a DAB that creates a new job cluster using the policy ID of our Job Compute policy. The Job Compute policy sets a value for spark_version.
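The relevant policy rule looks something like this (a sketch; the exact rule type and value in our policy are assumptions):

```json
{
  "spark_version": {
    "type": "fixed",
    "value": "auto:latest-lts"
  }
}
```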
I want to use the latest LTS Spark version for my jobs where possible, rather than having to pin an exact version in the DAB.
Setting
spark_version: "auto:latest-lts"
in my DAB does not work; I get the following error: "INVALID_PARAMETER_VALUE: Invalid spark version auto:latest-lts." For this to work correctly, I would expect the resulting bundle.tf.json to contain a reference similar to data.databricks_spark_version.latest.id, using the databricks_spark_version Terraform data source.
Omitting spark_version from my DAB instead produces a bundle.tf.json with an empty string:
"spark_version": "",
and I get a similar error: "INVALID_PARAMETER_VALUE: Invalid spark version ."
Are there plans for the CLI to support this use case?
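For reference, this is roughly the shape of bundle.tf.json I would expect the CLI to generate for the auto:latest-lts case (a sketch trimmed to the relevant fields; the resource names and policy placeholder are made up, and the long_term_support argument is from the databricks_spark_version data source):

```json
{
  "data": {
    "databricks_spark_version": {
      "latest_lts": {
        "long_term_support": true
      }
    }
  },
  "resource": {
    "databricks_job": {
      "my_job": {
        "job_cluster": [
          {
            "job_cluster_key": "main",
            "new_cluster": {
              "policy_id": "<job compute policy id>",
              "spark_version": "${data.databricks_spark_version.latest_lts.id}"
            }
          }
        ]
      }
    }
  }
}
```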
Configuration
bundle.yml:
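A minimal configuration along these lines reproduces the issue (a sketch; the bundle, job, and notebook names, the node type, and the variable wiring are assumptions):

```yaml
bundle:
  name: spark-version-repro

variables:
  cluster_policy_id:
    description: ID of the Job Compute policy

resources:
  jobs:
    my_job:
      name: my-job
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            policy_id: ${var.cluster_policy_id}
            spark_version: "auto:latest-lts"  # rejected with INVALID_PARAMETER_VALUE
            node_type_id: Standard_DS3_v2
            num_workers: 1
      tasks:
        - task_key: main
          job_cluster_key: main
          notebook_task:
            notebook_path: ./src/main.ipynb
```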
Steps to reproduce the behavior
1. Define a bundle as in the configuration above.
2. Run: databricks bundle deploy --var "cluster_policy_id=<job compute policy id>"
Expected Behavior
The DAB should deploy to Databricks with the job cluster resolved to the latest LTS Spark version.
Actual Behavior
Deployment fails with "INVALID_PARAMETER_VALUE: Invalid spark version auto:latest-lts." when spark_version is set to auto:latest-lts, and with "INVALID_PARAMETER_VALUE: Invalid spark version ." when it is omitted.
OS and CLI version
Windows 10, Databricks CLI v0.211.0
Is this a regression?
No
Debug Logs