Closed by matthewchoy-envisso 1 month ago
@matthewchoy-envisso Thanks for reporting this issue. We'll investigate whether we can support overriding parameters like you propose.
Meanwhile, to unblock you, you can leverage complex variables to define the parameters and override them in a target. Docs: https://docs.databricks.com/en/dev-tools/bundles/settings.html#complex-variables
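To illustrate the suggested workaround, a minimal sketch of using a complex variable for job parameters (the variable name, job name, and parameter values here are hypothetical, not from the original report):

```yaml
# Sketch only: job_params, my_job, and the parameter values are assumptions.
variables:
  job_params:
    description: "Job parameters, overridable per target"
    type: complex
    default:
      - name: env
        default: dev

resources:
  jobs:
    my_job:
      parameters: ${var.job_params}

targets:
  prod:
    variables:
      job_params:
        - name: env
          default: prod
```

The base definition references `${var.job_params}` once, and each target replaces the whole variable value rather than overriding the `parameters` block directly.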
@matthewchoy-envisso
I had the same issue and ended up removing job parameters from the job template. So in `/resources/my_job.yml` the `parameters` block is completely removed, and these parameters are defined only in `databricks.yml`.
@andrewnester it behaves the same with the health rules block, probably because in both cases a rule and a parameter are objects? As a workaround we are currently doing the same as with job parameters: simply removing the whole block from the job template.
Example `resources/my_job.yml`:

```yaml
resources:
  jobs:
    my_job:
      name: "My job"
      health:
        rules:
          - metric: RUN_DURATION_SECONDS
            op: GREATER_THAN
            value: 5400
```
`databricks.yml` (the flattened original omitted the `jobs:` level, which target overrides require):

```yaml
targets:
  test:
    resources:
      jobs:
        my_job:
          health:
            rules:
              - metric: RUN_DURATION_SECONDS
                op: GREATER_THAN
                value: 7200
```
Output:

```
Deployment complete!
Error: terraform apply: exit status 1

Error: cannot update job: Duplicate (metric,op) pairs: (RUN_DURATION_SECONDS, GREATER_THAN) found in health rules

  with databricks_job.unity_catalog_maintenance,
  on bundle.tf.json line 500, in resource.databricks_job.my_job:
 500: }
```
@blood-onix Thanks for reporting. You're right that this is a similar issue to job parameters, but it's a more difficult one to resolve. It looks like health rules are keyed on a tuple of (metric, op) rather than a single key.
For job parameters, we can make a change where the override takes precedence over the base definition. This is the same approach we take for job clusters and tasks. The override is merged into the base definition if the keys in the list match. This means you can specialize only the fields you need to specialize and don't need to copy the whole configuration block.
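To show what merge-by-key looks like for job clusters (a hypothetical example, not from this thread — the job name, cluster key, and sizes are assumptions):

```yaml
# The override below is merged into the base definition because
# job_cluster_key matches; unlisted fields are inherited from the base.
resources:
  jobs:
    my_job:
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: 15.4.x-scala2.12
            num_workers: 2

targets:
  prod:
    resources:
      jobs:
        my_job:
          job_clusters:
            - job_cluster_key: main
              new_cluster:
                num_workers: 8  # only this field is specialized
```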
@blood-onix You could work around this by defining a variable called `run_duration_threshold` and setting its value differently for different targets. Then you refer to this variable just once in the health rule definition.
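A sketch of that workaround, using the variable name from the comment (the job name and threshold values are assumptions based on the earlier example):

```yaml
# Sketch only: the health rule is defined once and parameterized per target.
variables:
  run_duration_threshold:
    description: "RUN_DURATION_SECONDS alert threshold"
    default: 5400

resources:
  jobs:
    my_job:
      health:
        rules:
          - metric: RUN_DURATION_SECONDS
            op: GREATER_THAN
            value: ${var.run_duration_threshold}

targets:
  test:
    variables:
      run_duration_threshold: 7200
```

Because the rule itself appears only once, no duplicate (metric, op) pair is ever produced.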
While the same is possible for job parameters, I have made a change to allow native overrides for job parameters in #1659 and this will be part of the next CLI release.
Describe the issue
I want to override a job parameter default value in a target configuration, so that when deploying the job to that target, the job parameter has a different default value.
Configuration
Have a job with a parameter. Try to override the job parameter in the target.
Steps to reproduce the behavior
Deploy the job to the target.
Expected Behavior
Job deploys with new job parameter default.
Actual Behavior
OS and CLI version
Please provide the version of the CLI (e.g. v0.1.2) and the operating system (e.g. windows). You can run `databricks --version` to get the version of your Databricks CLI.
Is this a regression?
Did this work in a previous version of the CLI? If so, which versions did you try?
Debug Logs
Output logs if you run the command with debug logs enabled. Example: `databricks bundle deploy --log-level=debug`. Redact if needed.