bitnami / charts

Bitnami Helm Charts
https://bitnami.com

[bitnami/mlflow] Multiple replicas for mlflow with PVC fails #28495

Closed. SachinMaharana closed this issue 3 weeks ago

SachinMaharana commented 1 month ago

Name and Version

bitnami/mlflow 1.4.11

What architecture are you using?

amd64

What steps will reproduce the bug?

image:
  registry: docker.io
  repository: bitnami/mlflow
  tag: 2.14.1-debian-12-r1

postgresql:
  enabled: false

minio:
  enabled: false

run: 
  resources:
    requests:
      cpu: 50m
      memory: 120Mi
    limits:
      memory: 256Mi

tracking: 
  replicaCount: 2
  resources:
    requests:
      cpu: 50m
      memory: 500Mi
    limits:
      memory: 1200Mi
  command: [ "/bin/sh", "-c" ]
  args:
    - >
      mlflow server
      --host=0.0.0.0 --port=5000 --app-name=basic-auth --serve-artifacts
      --artifacts-destination=gs://<>
      --backend-store-uri=postgresql://<>

When applying this Helm chart with the above values, we get a Multi-Attach volume error. Note: we are using GCS as the artifact storage and PostgreSQL as the backend store.
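
For context, a Multi-Attach error generally means the tracking PVC is backed by a ReadWriteOnce volume (the usual mode for GCE persistent disks), which can only be attached to one node at a time. A rough, illustrative sketch of such a claim (name and size are hypothetical, not taken from the chart):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: mlflow-tracking        # hypothetical name, for illustration only
spec:
  accessModes:
    - ReadWriteOnce            # only one node may attach the backing disk at a time
  resources:
    requests:
      storage: 8Gi             # illustrative size

With replicaCount: 2, the scheduler may place the second pod on a different node, and attaching the same ReadWriteOnce disk there fails.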

Are you using any custom parameters or values?

No.

What is the expected behavior?

I want to have an HA instance of mlflow, which should be possible by enabling multiple replicas.

What do you see instead?

The Deployment fails with a Multi-Attach volume error, as multiple pods try to bind to the same PV.

Additional information

To confirm: if we are using GCS as the artifact store and an external PostgreSQL as the backend store, do we even need persistence enabled with a PVC?

javsalgar commented 1 month ago

Hi,

Could you try deploying with PVC creation disabled (tracking.persistence.enabled=false)? If the artifact storage is configured, the PVC may not be necessary.
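
For reference, a minimal values sketch of that suggestion, assuming the chart exposes tracking.persistence.enabled as mentioned above (replica count kept from the original values):

tracking:
  replicaCount: 2
  persistence:
    enabled: false   # skip the PVC; artifacts go to GCS, metadata to the external PostgreSQL

The same flag can also be set at install time, e.g. helm upgrade --install mlflow bitnami/mlflow -f values.yaml --set tracking.persistence.enabled=false.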

github-actions[bot] commented 4 weeks ago

This Issue has been automatically marked as "stale" because it has not had recent activity (for 15 days). It will be closed if no further activity occurs. Thanks for the feedback.

github-actions[bot] commented 3 weeks ago

Due to the lack of activity in the last 5 days since it was marked as "stale", we proceed to close this Issue. Do not hesitate to reopen it later if necessary.