elementary-data / elementary

The dbt-native data observability solution for data & analytics engineers. Monitor your data pipelines in minutes. Available as self-hosted or cloud service with premium features.
https://www.elementary-data.com/
Apache License 2.0

Orchestrator support: Argo Workflows #1162

Open · menzenski opened this issue 1 year ago

menzenski commented 1 year ago

Is your feature request related to a problem? Please describe.

We run dbt-core (within Meltano; see https://github.com/elementary-data/elementary/issues/1160) using Argo Workflows. Per https://docs.elementary-data.com/deployment-and-configuration/collect-job-data#cant-find-your-orchestrator-missing-info, I'm opening this issue because elementary doesn't currently support Argo Workflows as an orchestrator.

Describe the solution you'd like

It would be great to have Argo Workflows metadata captured in the same orchestrator metadata model as the other supported orchestrators:

- Orchestrator name: argo_workflows
- Job name: job_name (the Argo Workflows workflow name)
- Job ID: n/a?
- Job results URL: n/a?
- The ID of a specific run execution: job_run_id (the Argo Workflows uid)
- Job run results URL: n/a?

Describe alternatives you've considered

Not sure.

Additional context

Would you be willing to contribute this feature? Yes!

menzenski commented 1 year ago

It seems to be possible to pass Argo Workflows orchestrator information today by templating the dbt invocation; Argo substitutes {{workflow.name}} and {{workflow.uid}} before the command runs, so dbt receives plain string values:

dbt run --select elementary --vars '{orchestrator: "Argo Workflows", job_id: "{{workflow.name}}", job_run_id: "{{workflow.uid}}"}'
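For context, a minimal sketch of how that invocation might be wired into an Argo Workflow spec, assuming a container image with dbt and the elementary package installed (the image name and template name below are hypothetical):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dbt-elementary-
spec:
  entrypoint: dbt-run
  templates:
    - name: dbt-run
      container:
        image: my-registry/dbt-elementary:latest  # hypothetical image
        command: [sh, -c]
        args:
          # Argo resolves {{workflow.name}} and {{workflow.uid}} before the
          # container starts, so the --vars payload reaches dbt fully expanded.
          - >-
            dbt run --select elementary
            --vars '{orchestrator: "Argo Workflows",
            job_id: "{{workflow.name}}",
            job_run_id: "{{workflow.uid}}"}'
```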