What happened?
At least in the classic UI, when a task is expanded dynamically, retrying the failed mapped tasks makes the stats of the other mapped task instances disappear.
What you think should happen instead?
No response
How to reproduce
import random
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def random_fail_task(task_id):
    if random.random() < 0.5:
        raise Exception(f"Task {task_id} failed")
    print(f"Task {task_id} succeeded")


def all_success(task_id):
    print(f"Task {task_id} succeeded")


with DAG(
    dag_id='dynamic_task_expansion',
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    tasks = PythonOperator.partial(
        task_id='task',
        # Uncomment below for the second coherence check
        # retries=0,
        python_callable=random_fail_task,  # <- change to all_success for the first coherence check
    ).expand(op_args=[[i] for i in range(100)])
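Since the DAG has no schedule, it has to be triggered manually (from the UI or with airflow dags trigger dynamic_task_expansion). Each run expands the task into 100 mapped task instances, and with random_fail_task roughly half of them are expected to fail on the first try.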
Coherence checks
When replacing random_fail_task with all_success, we get the exact number of tasks, all successful (100).
When we apply retries=0 with random_fail_task, the total number of tasks (successful + failed) is once again 100.
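The counts above can also be checked independently of the grid view. The following is a minimal sketch that tallies the mapped task-instance states through the stable REST API, assuming basic-auth access to the webserver; the URL, credentials, and run id are hypothetical placeholders.

from collections import Counter

import requests

AIRFLOW_URL = "http://localhost:8080"          # adjust for your deployment
RUN_ID = "manual__2023-01-01T00:00:00+00:00"   # hypothetical run id
AUTH = ("airflow", "airflow")                  # hypothetical credentials

# Page through all task instances of the run and tally their states.
states = Counter()
offset = 0
while True:
    resp = requests.get(
        f"{AIRFLOW_URL}/api/v1/dags/dynamic_task_expansion/dagRuns/{RUN_ID}/taskInstances",
        params={"limit": 100, "offset": offset},
        auth=AUTH,
    )
    resp.raise_for_status()
    payload = resp.json()
    batch = payload["task_instances"]
    states.update(ti["state"] for ti in batch)
    offset += len(batch)
    if not batch or offset >= payload["total_entries"]:
        break

print(dict(states), "total:", sum(states.values()))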
Reproduction
When running the random_fail_task task with retries >= 1, some of the mapped tasks fail and are up for retry.
As soon as the failed tasks are triggered for re-run, some of the other tasks disappear from the UI.
And finally we're left with fewer tasks than we started with.
When looking at the "Mapped Tasks" tab, all of the tasks are still there.
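One way to confirm this outside the UI is to count the task-instance rows directly in the metadata database. The sketch below assumes it is run in an environment where the Airflow packages and the metadata DB connection are available (e.g. on a scheduler or worker), and uses a hypothetical run id.

from collections import Counter

from airflow.models import TaskInstance
from airflow.utils.session import create_session

RUN_ID = "manual__2023-01-01T00:00:00+00:00"  # hypothetical run id

# Fetch every mapped instance of task 'task' for this run and tally the states.
with create_session() as session:
    tis = (
        session.query(TaskInstance)
        .filter(
            TaskInstance.dag_id == "dynamic_task_expansion",
            TaskInstance.run_id == RUN_ID,
            TaskInstance.task_id == "task",
        )
        .all()
    )
    states = Counter(ti.state for ti in tis)

print(f"{len(tis)} mapped task instances:", dict(states))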
Apache Airflow version
2.10.2
If "Other Airflow 2 version" selected, which one?
No response
Operating System
Linux
Versions of Apache Airflow Providers
No response
Deployment
Google Cloud Composer
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
Code of Conduct