Closed: rmgogogo closed this issue 4 years ago
@jessiezcc
@Bobgy
Option-3: we provide a sample component and tutorial to demonstrate how to use user code to handle notifications. (It would also help mitigate the requests for Workflow-style notification support.)
Option-3 will be treated with priority; a sketch of such a component follows below.
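For illustration, here is a minimal sketch of such a notification component, assuming the KFP v1 SDK's lightweight Python components and a generic incoming-webhook endpoint; the webhook URL and payload shape are placeholders, not part of any KFP API:

```python
from kfp.components import create_component_from_func

def notify(webhook_url: str, message: str):
    """Post a simple JSON payload to an incoming-webhook endpoint (e.g. Slack)."""
    import requests
    requests.post(webhook_url, json={"text": message})

# Wrap the function as a reusable lightweight KFP component.
notify_op = create_component_from_func(
    notify,
    base_image="python:3.8",
    packages_to_install=["requests"],
)
```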
FYI: argo has examples we can follow: https://github.com/argoproj/argo/blob/master/docs/workflow-notifications.md
I read the workflow-notifications.md. Thanks! Let me confirm my understanding.
The 2nd option "Default Workflow Spec" is available from v2.7. The 3rd option "Workflow Events" is available from v2.7.2.
However, kubeflow/pipelines v0.5.1 uses workflow-controller v2.3.0, so the 2nd and 3rd options wouldn't work. Is this correct?
The next release has already moved to Argo 2.7.5.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.
/reopen (hope it's ok for me to reopen!)
Just wondering if this is on the radar at all?
My team is migrating more and more production ETL and model building onto KFP (it's been great! thanks for all your work!). One pain point we're running into is monitoring/alerting/observability (i.e., people don't notice that a daily pipeline failed and we've been working with stale data). It would be great if KFP supported a way to get alerted in these cases.
Doing some searching, I found this approach using ExitHandler, which I'll experiment with in the meantime: https://stackoverflow.com/questions/57508382/kubeflow-pipeline-termination-notificaiton
Hi @jli, before KFP invents its own notification mechanism, you can use Argo's existing options -- see above: https://github.com/argoproj/argo/blob/master/docs/workflow-notifications.md
These can be configured transparently to KFP.
Thanks @Bobgy! I will give those a try.
For the "exit handler" approach 1: is that usable in KFP via https://kubeflow-pipelines.readthedocs.io/en/stable/source/kfp.dsl.html#kfp.dsl.ExitHandler ?
For the "default workflow spec" approach 2: I see in my k8s cluster that there's a workflow-controller-configmap
- do I just need to update this value with my desired workflowDefaults
config? Is it possible that KFP or Argo would remove/update the configmap in the future and cause my configuration to be lost?
@jli
Yes, you can use the KFP exit handler (see the sketch below).
No, your configmap won't be changed unless you upgrade KFP/Argo. You can maintain a kustomize overlay if you care about infra as code.
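To make the exit-handler answer concrete, here is a minimal sketch using kfp.dsl.ExitHandler with the KFP v1 SDK. The notify and work components are placeholders, and the {{workflow.name}}/{{workflow.status}} strings are Argo runtime variables that the Argo backend substitutes when the exit handler runs:

```python
from kfp import dsl
from kfp.components import create_component_from_func

def notify(message: str):
    # Placeholder notifier; in practice this would call a webhook, Pub/Sub, email, etc.
    print(message)

def work():
    print("real pipeline steps go here")

notify_op = create_component_from_func(notify, base_image="python:3.8")
work_op = create_component_from_func(work, base_image="python:3.8")

@dsl.pipeline(name="exit-handler-notification")
def pipeline():
    # The exit task runs after everything inside the ExitHandler block,
    # whether the run succeeded or failed.
    exit_task = notify_op("Run {{workflow.name}} finished with status {{workflow.status}}")
    with dsl.ExitHandler(exit_task):
        work_op()
```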
Is there any way in KFP or Argo to know which user launched a workflow? (I would like to be able to send more targeted alerts.)
If not, are there common patterns you know of for how we can add this type of metadata to KFP?
Thanks!
@jli I don't think we have separate metadata for that; can you send a feature request?
A current workaround is to add a pipeline parameter that defaults to the pipeline's author, for example:
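Here is a minimal sketch of that workaround, using a hypothetical owner_email parameter whose default identifies the pipeline author and which the notification step simply forwards:

```python
from kfp import dsl
from kfp.components import create_component_from_func

def notify_owner(owner_email: str, message: str):
    # Placeholder: route the alert to the given owner (email, chat handle, etc.).
    print(f"alerting {owner_email}: {message}")

notify_owner_op = create_component_from_func(notify_owner, base_image="python:3.8")

@dsl.pipeline(name="pipeline-with-owner")
def pipeline(owner_email: str = "pipeline-author@example.com"):
    # The default identifies the author; callers can override it per run.
    notify_owner_op(owner_email, "run {{workflow.name}}: {{workflow.status}}")
```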
The argo example link has moved to https://github.com/argoproj/argo-workflows/blob/master/docs/workflow-notifications.md
This is a request mentioned in today's community meeting: after a pipeline run finishes, we could provide a notification to trigger other tasks/systems.
Without this feature, it can still be handled via user code:
Option-1: The last step in the pipeline writes a file somewhere, e.g. GCS. A file watcher picks up the signal and triggers things outside of KFP.
Option-2: The last step in the pipeline publishes a message to a queue (e.g. Pub/Sub) to trigger other work (see the sketch below).
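For illustration, here is a minimal sketch of Option-2's last step publishing to a Pub/Sub topic with the google-cloud-pubsub client; the project, topic, and payload are placeholders:

```python
import json

from google.cloud import pubsub_v1

def publish_completion(project_id: str, topic_id: str, run_name: str, status: str) -> None:
    """Publish a small JSON message announcing that the pipeline run finished."""
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    payload = json.dumps({"run": run_name, "status": status}).encode("utf-8")
    # publish() returns a future; result() blocks until the message is accepted.
    publisher.publish(topic_path, payload).result()
```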
We will treat this FR with low priority for now, but keep this ticket open to make the community aware of the other options and to gather feedback on better proposals.