estahn opened 6 years ago
Determining how many tasks are running is not possible (yet at least...). You can, however, use the airflow_dag_run_state metric to see how many and when certain events happened (success, failed, running).
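As a rough sketch, a Prometheus query over that metric could count DAG runs per state. The label name `state` here is an assumption and may differ depending on the exporter version:

```promql
# Hypothetical query; the "state" label name is an assumption
sum(airflow_dag_run_state{state="running"})
```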
I think that task duration would be way too much data to export, since it would need a separate metric/dimension for every task instance. Maybe a different approach to this would be possible by counting the number of seconds taken and the number of tries or something... But it isn't possible at this moment.
@DemonTPx We want to use the metric to scale up pods in Kubernetes. Any idea what metric would be useful for that? We thought something like queue size.
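For context, scaling on something like queue size would typically go through the external/custom metrics API. A minimal sketch of a HorizontalPodAutoscaler using an external metric follows; the metric name `airflow_queue_size` and the deployment name are placeholders, and a metrics adapter (e.g. prometheus-adapter) would be needed to expose the metric to Kubernetes:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: airflow-worker
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: airflow-worker        # placeholder deployment name
  minReplicas: 1
  maxReplicas: 10
  metrics:
    - type: External
      external:
        metric:
          name: airflow_queue_size   # hypothetical metric name
        target:
          type: AverageValue
          averageValue: "10"         # scale up when queue depth per pod exceeds 10
```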
We use GCP Composer and the Stackdriver Exporter for that metric. But that does not help you if you are using a native Airflow deployment. https://cloud.google.com/monitoring/api/metrics_gcp#gcp-composer
Questions: