8gears / n8n-helm-chart

A Kubernetes Helm chart for n8n - a workflow automation tool. Easily automate tasks across different services, self-hosted on Kubernetes.
https://artifacthub.io/packages/helm/open-8gears/n8n
Apache License 2.0

setting autoscaling.enabled to true disables the workers replicas #87

Open maozza opened 6 months ago

maozza commented 6 months ago

I may be missing something, but when setting autoscaling.enabled to true, the replicas count is not being configured.

Source: https://github.com/8gears/n8n-helm-chart/blob/master/templates/deployment.worker.yaml

  {{- if not .Values.autoscaling.enabled }}
  replicas: {{ .Values.scaling.worker.count }}
  {{- end }}
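
To illustrate the effect: with autoscaling.enabled set to true, the guard above suppresses the replicas field entirely, and Kubernetes falls back to its default of 1 replica unless an HPA manages that Deployment. A minimal sketch of the rendered worker manifest under this assumption (the resource name is illustrative, not the chart's exact naming):

```yaml
# Hypothetical rendered output when autoscaling.enabled=true
apiVersion: apps/v1
kind: Deployment
metadata:
  name: n8n-worker        # illustrative name
spec:
  # replicas is omitted -> Kubernetes defaults to 1,
  # and scaling.worker.count is silently ignored
  selector:
    matchLabels:
      app: n8n-worker
  template:
    metadata:
      labels:
        app: n8n-worker
```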

The same issue exists with the webhook deployment.

Vad1mo commented 6 months ago

What's your config? Did you check the indentation?

maozza commented 6 months ago

Here is my config (values.yaml): autoscaling is enabled for the main n8n pod, and replicas are set for the webhook and worker. Is the if statement on line 9 of /templates/deployment.worker.yaml and /templates/deployment.webhooks.yaml a mistake?

autoscaling:
  enabled: true
  minReplicas: 2
  maxReplicas: 3
  targetCPUUtilizationPercentage: 80
  # targetMemoryUtilizationPercentage: 80

scaling:
  enabled: true
  worker:
    count: 10
    concurrency: 50
  # With .Values.scaling.webhook.enabled=true you disable Webhooks from the main process but you enable the processing on a different Webhook instance.
  # See https://github.com/8gears/n8n-helm-chart/issues/39#issuecomment-1579991754 for the full explanation.
  webhook:
    enabled: true
    count: 3

mhkarimi1383 commented 3 months ago

@Vad1mo

there is a problem in the HPA template that makes the HPA target only the main deployment
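
The reported behavior implies the HPA's scaleTargetRef is hard-wired to the main Deployment, so the worker and webhook Deployments never get autoscaled. A minimal sketch of what such a template would look like (the helper name `n8n.fullname` is an assumption about the chart's naming helpers):

```yaml
# Hypothetical excerpt of templates/hpa.yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: {{ include "n8n.fullname" . }}
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    # Always points at the main deployment; worker/webhook
    # deployments are never referenced by any HPA.
    name: {{ include "n8n.fullname" . }}
  minReplicas: {{ .Values.autoscaling.minReplicas }}
  maxReplicas: {{ .Values.autoscaling.maxReplicas }}
```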

also

  {{- if not .Values.autoscaling.enabled }}
  replicas: {{ .Values.scaling.worker.count }}
  {{- end }}

makes it omit the count value entirely. I think we need something like scaling.worker.autoscaling or a workerAutoscaling key in values (for both worker and webhook)
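
A sketch of what the proposed per-component flag could look like, as a hypothetical values layout (not the chart's actual schema) plus the matching template guard:

```yaml
# Hypothetical values.yaml layout with per-component autoscaling flags
scaling:
  worker:
    count: 10
    autoscaling:
      enabled: false     # HPA for workers only; independent of the main pod
  webhook:
    enabled: true
    count: 3
    autoscaling:
      enabled: false
```

The worker Deployment guard would then check its own flag instead of the global one:

```yaml
{{- if not .Values.scaling.worker.autoscaling.enabled }}
replicas: {{ .Values.scaling.worker.count }}
{{- end }}
```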

mhkarimi1383 commented 3 months ago

Hi @maozza, this is fixed in #112. You can test it to see if it helps.