Closed: kenmyers-8451 closed this issue 3 days ago
Thanks for reporting the issue, @kenmyers-8451!
Are you sure the job you're trying to reference has the key wf_dimensions?
This is supposed to work out of the box.
If possible, please share the output of databricks bundle validate --output json, or confirm that it includes that job.
@pietern thanks! I gave it another shot today from scratch and it seems to be working. I think I probably messed something up in my include section, since I was trying to import from another directory. After stating those paths more explicitly, it works now.
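For anyone hitting the same thing, the fix described here might look like replacing a broad glob include with explicit per-file paths. This is a hedged sketch only: the directory and file names below are assumptions for illustration, not taken from the actual bundle.

```yaml
# bundle.yml -- hypothetical fix: list each resource file explicitly
# instead of relying on a glob that may miss files in another directory.
# "resources/" and the file names are assumed, not from the report.
include:
  - resources/bundle_job1.yml
  - resources/bundle_job2.yml
```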
Describe the issue
My team has adopted the complex variable support added in 0.222.0, which has helped us clean up our main bundle file. However, when I tried to split our job resources into their own files, I ran into an issue. Our jobs are highly nested, with one job calling other jobs through run_job_task, and it seems that if the sub-job doesn't exist in the "root module", deployment throws an error.
Configuration
Make a file structure like:
In bundle.yml, include all bundle*.yml files.
In job2, have it perform a run_job_task of job1, referencing it by ${resources.jobs.job1.id}.
job1 can just be a simple run of a notebook.
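A minimal sketch of the layout described above might look like the following. All file names, job names, and the notebook path are assumptions for illustration; the reporter's actual structure wasn't shared.

```yaml
# --- bundle.yml (root bundle file; names are illustrative) ---
bundle:
  name: nested_jobs_example

include:
  - bundle*.yml

# --- bundle_job1.yml (a simple notebook job) ---
resources:
  jobs:
    job1:
      name: job1
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebooks/example.py

# --- bundle_job2.yml (calls job1 via run_job_task) ---
resources:
  jobs:
    job2:
      name: job2
      tasks:
        - task_key: trigger_job1
          run_job_task:
            job_id: ${resources.jobs.job1.id}
```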
Steps to reproduce the behavior
databricks bundle deploy
Expected Behavior
I expected that all of the resources would be merged together so that (in the above example) job2 would be able to reference job1 and deploy without issue.
Actual Behavior
You get an error like the following, which says the sub-job must be defined in the root module (I assume that means the root bundle file). I copied this error from our actual deployment rather than reproducing it from the example configuration above:
OS and CLI version
macOS 14.3.1 (23D60), Databricks CLI 0.222.0
Is this a regression?
n/a
Debug Logs
Can provide if needed.