astronomer / astro-provider-databricks

Orchestrate your Databricks notebooks in Airflow and execute them as Databricks Workflows
Apache License 2.0

Marking a running DAG/task as failed has no effect on the ongoing Databricks job run #1

Open pankajkoti opened 1 year ago

pankajkoti commented 1 year ago

Within the Airflow UI, when a Databricks workflow job is running and the DAG or a task is marked as failed, the DAG/task is marked failed in Airflow, but the ongoing Databricks job run is not cancelled/killed and continues processing.
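Until the provider propagates the cancellation itself, one possible workaround is an `on_failure_callback` that cancels the Databricks run via `DatabricksHook.cancel_run`. This is only a sketch, not part of this provider: the XCom key `run_id` and the connection id `databricks_default` are assumptions and may differ from how the workflow task group actually exposes the run id.

```python
# Sketch of a possible workaround: best-effort cancellation of the Databricks
# run when the Airflow task is marked failed. The XCom key "run_id" and the
# connection id "databricks_default" are assumptions and may need adapting.
from airflow.providers.databricks.hooks.databricks import DatabricksHook


def cancel_databricks_run(context):
    """Cancel the Databricks job run associated with the failed task, if any."""
    ti = context["ti"]
    run_id = ti.xcom_pull(task_ids=ti.task_id, key="run_id")  # assumed XCom key
    if run_id is None:
        return  # no run id pushed, nothing to cancel
    hook = DatabricksHook(databricks_conn_id="databricks_default")
    hook.cancel_run(run_id)


# Attach to a task (or via default_args for the whole DAG):
# my_databricks_task.on_failure_callback = cancel_databricks_run
```

Note that Airflow only invokes `on_failure_callback` when the task itself fails; marking a task failed from the UI may not trigger it in all Airflow versions, so this mitigates rather than fully resolves the issue.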

SenthilMalli commented 8 months ago

I am also having this issue. Is there any resolution for this?