Description

`_module.py` is generated in the same manner as Airflow's, so it remained unchanged; the same goes for `_requirements.txt`.
The `_dag` file provides an execution file to submit workflows directly to Argo using hera-workflows.
It is generated from the `argo_dag.jinja` template.
Since a Python function is the unit of execution, it stays fairly Airflow-like.
It defines a `WorkflowService`, either parameterized by the user or given default parameters,
which is submitted along with the `Workflow` instance.
Tasks and dependencies are generated in a different pattern, but reuse the same
methods that already provide the tasks and task definitions.
The service account token is looked up and appended to the `WorkflowService` as well.
Running `python3 {pipeline_name}_dag.py` submits the workflow directly to the current Argo environment.
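Sketched below is roughly what such a generated `_dag.py` could look like. This is a hedged illustration, not the generator's exact output: the host, service-account name, task names, and the `get_sa_token` helper are placeholders, and the exact hera-workflows API (the `Task`, `Workflow`, and `WorkflowService` signatures) varies between library versions:

```python
from base64 import b64decode

from kubernetes import client, config

from hera import Task, Workflow, WorkflowService


def get_sa_token(service_account: str, namespace: str = "default") -> str:
    """Hypothetical helper: look up the service account's token from its secret."""
    config.load_kube_config()
    v1 = client.CoreV1Api()
    secret_name = v1.read_namespaced_service_account(
        service_account, namespace
    ).secrets[0].name
    secret = v1.read_namespaced_secret(secret_name, namespace)
    return b64decode(secret.data["token"]).decode()


def say(message: str) -> None:
    # A Python function remains the unit of execution, as in the Airflow DAG.
    print(message)


# WorkflowService parameterized by the user, or given default parameters;
# the service account token is appended to it.
ws = WorkflowService(
    host="https://my-argo-server.example.com",  # placeholder host
    token=get_sa_token("argo-sa"),              # placeholder service account
)
w = Workflow("my-pipeline", service=ws)

# Tasks and dependencies are built from the same task definitions
# that the Airflow template already provides.
a = Task("a", say, [{"message": "hello"}])
b = Task("b", say, [{"message": "world"}])
a >> b
w.add_tasks(a, b)

# Running the file submits the workflow to the current Argo environment.
w.create()
```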
`_Dockerfile` is fine as is.
Current issues
Loading modules for each task is repetitive, so it should be handled when dissecting the tasks.
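To illustrate the issue (task names and imports are made up for this sketch): because each task is dissected into a standalone script body, every task currently repeats the same import block instead of having it hoisted once during dissection:

```python
# Hypothetical rendered task bodies: each one repeats the same imports,
# since every task is emitted as a self-contained script.

def task_a():
    import json  # repeated in every task body
    return json.dumps({"task": "a"})


def task_b():
    import json  # repeated again instead of being collected once
    return json.dumps({"task": "b"})
```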
Type of change
[x] New feature (non-breaking change which adds functionality)
How Has This Been Tested?