-
As a user of TDP, I would like to get the status of all components after the cluster has been deployed.
Proposition: use the Ansible `service_facts` module.
-
### Pre-requisites
- [X] I have double-checked my configuration
- [X] I have tested with the `:latest` image tag (i.e. `quay.io/argoproj/workflow-controller:latest`) and can confirm the issue stil…
-
### So I created this DAG with the following CLUSTER_CONFIG.
> DAG_ID = "dataproc_pyspark_test"
> PROJECT_ID = "project_id"
> CLUSTER_NAME = "simplesparkjob-airflow-cluster"
> REGION = "us-east…
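A minimal sketch of how such a DAG is usually wired together, for reference only: the `CLUSTER_CONFIG`, PySpark job URI, and region value below are placeholder assumptions (the original region is truncated above), not the actual values from this issue.

```python
# Sketch only: cluster config, job URI, and region are placeholder assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
    DataprocDeleteClusterOperator,
    DataprocSubmitJobOperator,
)

DAG_ID = "dataproc_pyspark_test"
PROJECT_ID = "project_id"
CLUSTER_NAME = "simplesparkjob-airflow-cluster"
REGION = "us-east1"  # placeholder; the original value is truncated in the issue

CLUSTER_CONFIG = {
    "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
    "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
}

PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/simple_spark_job.py"},
}

with DAG(DAG_ID, start_date=datetime(2024, 1, 1), schedule=None, catchup=False) as dag:
    # Provision the cluster, run the PySpark job, then tear the cluster down.
    create_cluster = DataprocCreateClusterOperator(
        task_id="create_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        cluster_config=CLUSTER_CONFIG,
    )
    submit_job = DataprocSubmitJobOperator(
        task_id="submit_pyspark_job",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )
    delete_cluster = DataprocDeleteClusterOperator(
        task_id="delete_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
    )
    create_cluster >> submit_job >> delete_cluster
```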
-
Continuing the work done in #209, add enhanced callback support across all callback types, both at the DAG level and at the task level.
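For illustration, a minimal sketch of the callback hooks this maps onto, written in plain Airflow rather than this project's own API; the DAG id and callables are placeholders.

```python
# Sketch of DAG-level vs. task-level callbacks in plain Airflow (placeholders only).
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator


def notify_success(context):
    # Receives the run context when the DAG run or task instance succeeds.
    print(f"success: {context['dag'].dag_id}")


def notify_failure(context):
    # Receives the run context when the DAG run or task instance fails.
    print(f"failure: {context['dag'].dag_id}")


with DAG(
    dag_id="callback_demo",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
    # DAG-level callbacks fire once per DAG run.
    on_success_callback=notify_success,
    on_failure_callback=notify_failure,
) as dag:
    # Task-level callbacks fire per task instance and can differ per task.
    EmptyOperator(
        task_id="noop",
        on_success_callback=notify_success,
        on_failure_callback=notify_failure,
        on_retry_callback=notify_failure,
    )
```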
-
I'm trying to push data test failures to S3. To do this, I'm using the documentation here (https://astronomer.github.io/astronomer-cosmos/configuration/cosmos-conf.html#remote-target-path) to set up t…
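A minimal sketch of the kind of configuration that page describes, assuming the settings are supplied as standard Airflow configuration environment variables; the bucket path and connection id below are placeholders, not values from this issue.

```python
# Sketch only: placeholder bucket path and connection id, set as Airflow config env vars.
import os

# Where Cosmos should upload the dbt target directory (run results, compiled SQL, ...).
os.environ["AIRFLOW__COSMOS__REMOTE_TARGET_PATH"] = "s3://my-bucket/dbt-target"
# Airflow connection used to authenticate against the remote store.
os.environ["AIRFLOW__COSMOS__REMOTE_TARGET_PATH_CONN_ID"] = "aws_default"
```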
-
**Context**
On 8 March 2024, Stephen Tang posted in the #airflow-dbt Slack channel ([link to the thread](https://apache-airflow.slack.com/archives/C059CC42E9W/p1709931389182029)):
> Is it poss…
-
Write comments, write the script for the video, and integrate the point system into the program.
-
# Read through the lab assignment: carefully!!!
# Read through the files in the project, try to understand the code!
# Python
# Understand the algorithms!!!!!
# Questions ahead of the supervision session:
-
Source: https://github.com/matrix-org/GSoC/blob/master/IDEAS.md
-
Currently, as of 0.2.1, when running a DAG with multiple `@task.ray` tasks, all of the tasks reuse the same Ray cluster. They should instead create separate, independent clusters.
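A minimal sketch of the pattern that triggers this, assuming the provider's `@task.ray` decorator accepts a per-task config dict as in its examples; the config contents and connection id are placeholder assumptions.

```python
# Sketch of the reported pattern: two @task.ray tasks in one DAG.
# The config dict and conn_id are placeholder assumptions based on the provider's examples;
# with 0.2.1 both tasks end up on the same Ray cluster instead of separate ones.
from datetime import datetime

from airflow.decorators import dag, task

RAY_CONFIG = {
    "conn_id": "ray_conn",  # placeholder connection id
    "num_cpus": 1,
    "runtime_env": {"pip": ["numpy"]},
}


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def ray_cluster_reuse_demo():
    @task.ray(config=RAY_CONFIG)
    def job_a():
        return 1

    @task.ray(config=RAY_CONFIG)
    def job_b():
        return 2

    # Expected: each task provisions its own independent cluster.
    # Observed in 0.2.1: both tasks reuse the same Ray cluster.
    job_a()
    job_b()


ray_cluster_reuse_demo()
```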