open-webui / helm-charts


Bad service name resolution importing model in k8s #19

Closed angelocorreia27 closed 3 months ago

angelocorreia27 commented 4 months ago

Action: importing Modelfiles

Error:

```
File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='open-webui-ollama.xxxx.svc.cluster.local', port=11434): Max retries exceeded with url: /api/create (Caused by NameResolutionError("<urllib3.connection.HTTPConnection object at 0x7f5cef705e50>: Failed to resolve 'open-webui-ollama.xxxx.svc.cluster.local' ([Errno -2] Name or service not known)"))
INFO:     10.32.0.1:0 - "POST /ollama/api/create HTTP/1.1" 500 Internal Server Error
```

Kubernetes version: 1.27.2

It seems the chart assumes 'open-webui-ollama' as the name of the Ollama service and does not recognize a different name.

0xThresh commented 4 months ago

Hi @angelocorreia27, am I correct in assuming that you used the built-in Ollama config with ollama.enabled = true? Could you drop me your values.yaml file so I can try to replicate the issue?

angelocorreia27 commented 4 months ago

Thank you for taking the time to analyze the issue. I am indeed using the built-in Ollama config with ollama.enabled = true.

However, I encountered a problem when deploying via ArgoCD. ArgoCD uses the metadata.name field from the manifest to create the service name. Since your manifest expects the service name for Ollama to be open-webui-ollama, this naming convention causes a conflict in my ArgoCD deployment setup.

Here is my current configuration (the chart is deployed through an ArgoCD Application; the Helm parameters were cut off):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: not-allowing-different-name-then-sample-ollama
  namespace: argo
spec:
  destination:
    namespace: angelocorreia27gmailcom
    server: "https:xxx.xx"
  source:
    path: ollama
    repoURL: "https://xxx"
    targetRevision: HEAD
    helm:
      parameters:
```

To work around this issue, I deployed Ollama and the OpenWebUI separately. After deploying them independently, I defined the Ollama endpoint within the WebUI configuration. Thank you.

0xThresh commented 4 months ago

Thanks for the extra context. I think in the case of your ArgoCD deployment, you'd want to update your clusterDomain value in values.yaml to match the service name Argo would deploy.
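As a sketch of what that could look like (the keys other than `clusterDomain` are assumptions; verify them against the chart version you're running):

```yaml
# values.yaml -- illustrative only, not the chart's exact schema
clusterDomain: cluster.local        # adjust if your cluster uses a custom DNS domain
ollama:
  enabled: true
  fullnameOverride: open-webui-ollama  # assumption: pin the Ollama service to the name the webui expects
```

The idea is to make the DNS name the webui computes match whatever Service name Argo actually creates, rather than the other way around.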

In the case of using a separate Ollama backend (which is the setup I use as well), you can also opt to update the ollamaUrls value to your Ollama service(s) name(s) so that it's loaded into Open WebUI correctly on startup. I have an example of that on a public repo here: https://github.com/0xThresh/self-hosted-genai/blob/main/helm.tf#L114
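For the separate-backend setup, a minimal values sketch along these lines could work (the service URL is an example for your cluster, not a real endpoint):

```yaml
# values.yaml -- sketch for an external Ollama backend
ollama:
  enabled: false                    # skip the bundled Ollama deployment
ollamaUrls:
  - "http://ollama.ollama.svc.cluster.local:11434"  # assumption: your Ollama service's in-cluster DNS name
```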

angelocorreia27 commented 3 months ago

Sure. Thank you.

0xThresh commented 3 months ago

I found that this was a legitimate issue. One of the helper values in our chart defined a service name in the Open WebUI deployment, while the Ollama chart's ollama.name value was still used to name the Ollama service. The two names could diverge, causing a mismatch for anyone using the integrated Ollama without manually setting ollamaBaseUrls. I've implemented a fix for this in a PR I'll be submitting soon.
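For anyone debugging similar chart issues, a hedged sketch of how this kind of mismatch arises (illustrative template fragments, not the chart's actual code): one template derives the hostname from the release name while another names the Service from its own value.

```yaml
# webui deployment (sketch) -- hostname built by a helper from the release name
env:
  - name: OLLAMA_BASE_URL
    value: "http://{{ .Release.Name }}-ollama.{{ .Release.Namespace }}.svc.cluster.local:11434"

# ollama subchart service (sketch) -- name taken from its own values
metadata:
  name: {{ .Values.ollama.name }}   # if this differs from "<release>-ollama", the DNS lookup above fails
```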

Thanks again for reporting it, and sorry I didn't catch it before.