Open flvndh opened 5 months ago
Thanks for reporting. This field was recently added and we haven't added the path rewriting logic for it yet. We'll work on a fix. In the meantime, you can unblock yourself by performing the interpolation manually:
tasks:
  - task_key: ingest
    job_cluster_key: single_node_cluster
    libraries:
      - requirements: /Workspace/${workspace.file_path}/requirements.txt
    spark_python_task:
      python_file: ../../runner.py
      parameters:
        - "--execution-time"
        - "{{ job.trigger.time.iso_datetime }}"
        - "--environment"
        - "${bundle.target}"
Hey, the same thing happens with the library path when using the for_each_task feature.
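For illustration, a minimal sketch of that scenario; the task keys, inputs, and relative paths are placeholders, and the point is only that the requirements path inside the nested for_each_task task is also left untouched on deploy:

tasks:
  - task_key: ingest_for_each            # placeholder task key
    for_each_task:
      inputs: '["customers", "orders"]'  # placeholder inputs
      task:
        task_key: ingest_iteration       # placeholder task key
        job_cluster_key: single_node_cluster
        libraries:
          # placeholder relative path; not rewritten to a workspace path on deploy
          - requirements: ../requirements.txt
        spark_python_task:
          python_file: ../../runner.py
          parameters:
            - "{{input}}"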
@pietern This might also be needed: https://github.com/databricks/cli/pull/1543
Describe the issue
I use DAB to deploy a Python task. I specify my Python dependencies in a requirements.txt file. When I deploy the bundle, the path to the requirements file is not replaced by its workspace counterpart.
Configuration
Here is my job specification:
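Only the shape of the specification matters here; in the sketch below the relative requirements path is a placeholder, and the rest mirrors the workaround shown earlier in the thread:

tasks:
  - task_key: ingest
    job_cluster_key: single_node_cluster
    libraries:
      # placeholder relative path; this is the entry that is not rewritten on deploy
      - requirements: ../requirements.txt
    spark_python_task:
      python_file: ../../runner.py
      parameters:
        - "--execution-time"
        - "{{ job.trigger.time.iso_datetime }}"
        - "--environment"
        - "${bundle.target}"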
Steps to reproduce the behavior
databricks bundle deploy
Expected Behavior
The dependent libraries path should be rewritten to its workspace counterpart.
Actual Behavior
The dependent libraries path is left unchanged as the local relative path.
OS and CLI version
OS: Debian 12
CLI version: v0.221.1