When a legacy ELT schedule is invoked, it is executed via `meltano schedule run`, which takes care of populating the env if one is specified on the schedule. Jobs, however, are executed directly via `meltano run` and have no idea that they're being executed as part of a schedule.
We could (should?) update the DAG generator to inject the env when calling `meltano run`. It should be as simple as extending the env by adding the schedule's env dict to the BashOperator's `env`:
```python
task = BashOperator(
    task_id=task_id,
    bash_command=f"cd {PROJECT_ROOT}; {MELTANO_BIN} run {run_args}",
    dag=dag,
    env=schedule.get("env", {}),  # snag the env from the schedule
    append_env=True,  # append to, don't replace, the inherited env
)
```
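To make the intended semantics of `append_env=True` concrete, here is a minimal sketch of the merge behavior we're relying on: the schedule's env overlays the inherited process environment rather than replacing it. The `build_task_env` helper, the schedule dict, and all names in it are hypothetical, purely for illustration:

```python
def build_task_env(schedule: dict, inherited_env: dict) -> dict:
    """Sketch of what append_env=True buys us: the schedule's env
    is layered on top of the inherited environment, with the
    schedule's values winning on key conflicts."""
    merged = dict(inherited_env)             # start from the inherited env
    merged.update(schedule.get("env", {}))   # overlay the schedule's env
    return merged

# Hypothetical schedule entry, shaped like what the DAG generator reads.
schedule = {"name": "daily-load", "env": {"TAP_FOO_START_DATE": "2022-01-01"}}
inherited = {"PATH": "/usr/bin", "MELTANO_ENVIRONMENT": "prod"}

env = build_task_env(schedule, inherited)
print(env["TAP_FOO_START_DATE"])  # value injected from the schedule
print(env["PATH"])                # inherited value is preserved
```

With `append_env=False` (the BashOperator default), only the `env` dict would be passed to the command, which is exactly the gap this change avoids.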
@aaronsteers added this to the engineering assignments board. This is a quick fix that closes a big gap in expectations around schedules between `run` and `elt`.
slack ref: https://meltano.slack.com/archives/C01TCRBBJD7/p1663679958598429