**fjakobs** opened this issue 2 months ago
Hi, I found a workaround that solves this: declare a list of tasks and sort it by `task_key` before feeding it into the `dynamic` block.
resource "databricks_job" "job" {
...
dynamic "task" {
for_each = local.task_sorted_list
content {
...
}
}
}
variable "task" {
type = list(object({
task_key = string
notebook_path = string
base_parameters = optional(map(string), {})
depends_on = optional(list(string), [])
run_if = optional(string, "ALL_SUCCESS")
max_retries = optional(number, 0)
min_retry_interval_millis = optional(number, 0)
retry_on_timeout = optional(bool, false)
timeout_seconds = optional(number, 0)
}))
}
locals {
task_sorted_keys = distinct(sort(var.tasks[*].task_key))
task_sorted_list = flatten([for value in local.task_sorted_keys : [for elem in var.tasks : elem if value == elem.task_key]])
}
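With this in place, the tasks can be declared in any order, since `local.task_sorted_list` re-sorts them by `task_key` before they reach the `dynamic` block. For example, via a `.tfvars` file (the task keys and notebook paths here are made up):

```hcl
tasks = [
  {
    task_key      = "b_report"        # hypothetical task key
    notebook_path = "/Shared/report"  # hypothetical path
    depends_on    = ["a_ingest"]
  },
  {
    task_key      = "a_ingest"
    notebook_path = "/Shared/ingest"
  },
]
```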
Hope this helps :)
This can only be properly addressed once we move to the plugin framework, which will give us much more control over the generated plan and the ability to inspect the config directly.
I have a Terraform config with a job containing two Python file tasks that never reaches a stable state. On each `apply`, Terraform updates the first task to match the second task defined in the config and updates the second task to match the first one. Together with #3951 this leads to a situation where a task can end up with `source = GIT` even though this is not expected. I learned from @mgyucht that this doesn't happen if the tasks appear in sorted order in the Terraform config.
### Configuration
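A minimal sketch of a configuration with this shape — two `spark_python_task` tasks declared in non-sorted order; the task keys, cluster ID, and file paths are hypothetical, not taken from the original report:

```hcl
resource "databricks_job" "this" {
  name = "repro"

  # Tasks deliberately declared in non-sorted order ("b" before "a").
  task {
    task_key            = "b_task"
    existing_cluster_id = "0000-000000-example"  # hypothetical cluster ID

    spark_python_task {
      python_file = "/Workspace/Shared/b.py"  # hypothetical path
    }
  }

  task {
    task_key            = "a_task"
    existing_cluster_id = "0000-000000-example"

    spark_python_task {
      python_file = "/Workspace/Shared/a.py"
    }
  }
}
```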
### Expected Behavior

The second call to `terraform apply` should detect no changes and be a no-op.

### Actual Behavior

`terraform apply` always detects changes that swap the two tasks' attributes back and forth.

### Terraform and provider versions

- Databricks provider 1.50.0
- Terraform v1.5.7
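For reference, pinning these versions in the root module might look like this (a sketch using the provider's registry address):

```hcl
terraform {
  required_version = "1.5.7"

  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "1.50.0"
    }
  }
}
```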