kubeflow / pipelines

Machine Learning Pipelines for Kubeflow
https://www.kubeflow.org/docs/components/pipelines/
Apache License 2.0

extract string value of PipelineParam inside a pipeline #2725

Closed maggiemhanna closed 4 years ago

maggiemhanna commented 4 years ago

Hello,

Extracting the string value of a PipelineParam with str(param.value) always yields None.

I would like to extract the string value of a PipelineParam inside a pipeline so that I can use it to rename my component.

Example:

@dsl.pipeline(name='my-pipeline')
def pipeline(loopidy_doop: list = [{'a': "first", 'b': 2}, {'a': "second", 'b': 20}]):
    with dsl.ParallelFor(loopidy_doop) as item:
        op1 = dsl.ContainerOp(
            name="my-in-cop1",
            image="library/bash:4.4.23",
            command=["sh", "-c"],
            arguments=["echo no output global op1, item.b: %s" % item.b],
        ).set_display_name(str(item.a.value))

Is there a way Kubeflow can support this kind of operation?

numerology commented 4 years ago

This is because at compile time the parameter itself does not have a value. Do you think it's a legit use case to support treating item.a as a string here @Ark-kun ?
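To illustrate the compile-time behavior, here is a simplified stand-in for the v1 PipelineParam (a toy class, not the real kfp implementation; the placeholder format is assumed from the v1 DSL convention): at compile time the object carries only a name, so stringifying it produces a placeholder that the backend substitutes at run time, never the runtime value.

```python
# Minimal sketch of why a PipelineParam has no value at compile time.
# This is an illustrative toy class, NOT the real kfp implementation.
class FakePipelineParam:
    def __init__(self, name, op_name='', value=None):
        self.name = name
        self.op_name = op_name
        self.value = value  # only ever holds a compile-time *default*, if any

    def __str__(self):
        # Stringifying yields a placeholder string; the backend substitutes
        # the actual value only when the pipeline run executes.
        return '{{pipelineparam:op=%s;name=%s}}' % (self.op_name, self.name)

item_a = FakePipelineParam('loopidy_doop-subvar-a')
print(str(item_a))    # a placeholder, not 'first' or 'second'
print(item_a.value)   # None: no default was given
```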

Ark-kun commented 4 years ago
  1. PipelineParam is a bit of an implementation detail. Users should treat it as an opaque reference to some future data.
  2. param.value was used to specify the default value of the pipeline parameter.

In any case, the value semantically cannot exist at pipeline compilation time, since it is only created during pipeline execution.

P.S. .set_display_name(str(item.a)) might actually work, although this is not officially supported.
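The reason str(item.a) can work where str(item.a.value) cannot can be sketched with a toy op (illustrative only, not kfp; the placeholder format is assumed from the v1 DSL): the display name stored at compile time is just a string, so a placeholder survives into the compiled workflow and can be filled in by the backend at run time, whereas str(None) bakes in the literal text 'None'.

```python
# Toy sketch (not the real kfp ContainerOp) contrasting the two calls.
class ToyOp:
    def __init__(self):
        self.display_name = None

    def set_display_name(self, name: str):
        # At compile time this just records a string on the op.
        self.display_name = name
        return self

# v1-style placeholder format (assumed for illustration).
PLACEHOLDER = '{{pipelineparam:op=;name=loopidy_doop-subvar-a}}'

op_ok = ToyOp().set_display_name(str(PLACEHOLDER))  # placeholder embedded,
                                                    # resolvable at run time
op_bad = ToyOp().set_display_name(str(None))        # literal string 'None'

print(op_ok.display_name)
print(op_bad.display_name)
```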

apeccaud commented 4 years ago

Is there any other option to give a meaningful name to tasks in a loop?

stale[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] commented 4 years ago

This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.