opendatahub-io / data-science-pipelines-tekton

Kubeflow Pipelines on Tekton
https://developer.ibm.com/blogs/kubeflow-pipelines-with-tekton-and-watson/
Apache License 2.0

[WIP] fix: moving the hardcoded configs in KFPv2 Launcher to configmap. Fixes #494 #191

Closed amadhusu closed 10 months ago

amadhusu commented 10 months ago

Which issue is resolved by this Pull Request: Resolves #494

Description of your changes: This follows Option 1 of the suggestion in kubeflow/pipelines/issues/9689

We want to allow these launcher configs to be changed from their defaults, preferably on a per-namespace basis.

I propose we add these configs to the ConfigMap/kfp-launcher that already exists in each profile namespace to set defaultPipelineRoot.

For example, a new ConfigMap/kfp-launcher in a profile namespace might look like this:

```yaml
data:
  defaultPipelineRoot: "minio://mlpipeline/v2/artifacts"

  ## minio endpoint config
  defaultMinioEndpoint: "minio.example.com:9000"

  ## minio auth configs
  minioAuthConfig: "mlpipeline-minio-artifact"
  minioAuthConfigAccessKey: "access_key"
  minioAuthConfigSecretKey: "secret_key"
```
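For context, a complete per-namespace manifest wrapping the data above might look like the sketch below. The namespace name `my-profile` is a placeholder, and the key names mirror the proposal rather than any shipped schema:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: kfp-launcher
  namespace: my-profile   # each profile namespace carries its own copy
data:
  defaultPipelineRoot: "minio://mlpipeline/v2/artifacts"
  defaultMinioEndpoint: "minio.example.com:9000"
  minioAuthConfig: "mlpipeline-minio-artifact"
  minioAuthConfigAccessKey: "access_key"
  minioAuthConfigSecretKey: "secret_key"
```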

OPTION 1: have the kfp-launcher container read the ConfigMap/kfp-launcher when it executes; it can then pass the minio configs down when it calls objectstore.OpenBucket.
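The option above can be sketched in Go. This is a minimal, hypothetical illustration of the "read ConfigMap data, fall back to hardcoded defaults" step only — the key names come from the proposal in this PR, the fallback values are assumptions modeled on KFP's usual defaults, and the real launcher code (including how the result feeds into objectstore.OpenBucket) may differ:

```go
package main

import "fmt"

// minioConfig is a hypothetical struct holding the launcher's
// object-store settings resolved from ConfigMap/kfp-launcher.
type minioConfig struct {
	Endpoint     string // minio host:port
	SecretName   string // Secret holding the credentials
	AccessKeyKey string // key inside the Secret for the access key
	SecretKeyKey string // key inside the Secret for the secret key
}

// fromConfigMapData builds a minioConfig from ConfigMap data,
// falling back to an assumed hardcoded default when a key is
// absent or empty (mirroring what the PR wants to make configurable).
func fromConfigMapData(data map[string]string) minioConfig {
	get := func(key, def string) string {
		if v, ok := data[key]; ok && v != "" {
			return v
		}
		return def
	}
	return minioConfig{
		Endpoint:     get("defaultMinioEndpoint", "minio-service.kubeflow:9000"),
		SecretName:   get("minioAuthConfig", "mlpipeline-minio-artifact"),
		AccessKeyKey: get("minioAuthConfigAccessKey", "accesskey"),
		SecretKeyKey: get("minioAuthConfigSecretKey", "secretkey"),
	}
}

func main() {
	// A namespace that only overrides the endpoint keeps the
	// default credential settings.
	cfg := fromConfigMapData(map[string]string{
		"defaultMinioEndpoint": "minio.example.com:9000",
	})
	fmt.Println(cfg.Endpoint)
	fmt.Println(cfg.SecretName)
}
```

The launcher would then read the named Secret from its own namespace and hand the endpoint plus credentials to the object-store layer, so each profile namespace can point at a different minio.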

Environment tested:

Checklist:

openshift-ci[bot] commented 10 months ago

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: amadhusu

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

- **[OWNERS](https://github.com/opendatahub-io/data-science-pipelines/blob/dspv2/OWNERS)**

Approvers can indicate their approval by writing `/approve` in a comment. Approvers can cancel approval by writing `/approve cancel` in a comment.
gregsheremeta commented 10 months ago

Migrated to Jira: https://issues.redhat.com/browse/RHOAIENG-1628

sorry bad robot

amadhusu commented 10 months ago

Closing this PR as the work has been moved here.

RichardPinter commented 5 months ago

Has this been resolved? I can connect to my external minio to write manifests, but for some logs it still tries to write to the internal minio:

```
failed to execute component: failed to upload output artifact "data_out" to remote storage URI "minio://mlpipeline/v2/artifacts/xxxxx/data_out": uploadFile(): unable to complete copying "/minio/mlpipeline/v2/artifacts/xxxxx/gen-sine/data_out" to remote storage "gen-sine/data_out": failed to close Writer for bucket: blob (key "gen-sine/data_out") (code=NotFound): NoSuchBucket: The specified bucket does not exist
```