AugustDev / enchanted

Enchanted is an iOS and macOS app for chatting with private, self-hosted language models such as Llama 2, Mistral, or Vicuna using Ollama.
Apache License 2.0
3.37k stars 200 forks

No response when using ollama behind an nginx proxy #52

Open rwatts3 opened 6 months ago

rwatts3 commented 6 months ago

I am running ollama on my Google Compute Engine instance behind an nginx proxy. I can navigate to the endpoint and confirm that /api/tags returns a response, as do other endpoints such as /api/version.

Unfortunately, when I add my endpoint via Enchanted's settings, I can send a chat but never receive a response. I also confirmed via the server logs that the request reaches ollama.

AugustDev commented 6 months ago

Hi @rwatts3, if you could post a minimal set of instructions to reproduce your environment, I'd be happy to debug the issue.

2theo67 commented 5 months ago

Hello, I have the same issue as @rwatts3. I can't speak to his setup, but I can explain mine in case it helps you, @AugustDev!

Here is my nginx reverse proxy configuration for exposing the API:

upstream ollama {
    server 127.0.0.1:11434;
}

server {
    listen 80;
    listen 443 ssl http2;

    ssl_protocols   TLSv1.3;
    ssl_ciphers     ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384;
    ssl_prefer_server_ciphers off;

    if ($scheme = http) {
        return 301 https://$host$request_uri;
    }

    location /api/ { # I've tried with and without / at the end of api, but same result
        proxy_pass http://ollama;
        proxy_redirect off;
        proxy_set_header Host       localhost:11434;
    }
}
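For what it's worth, a frequent culprit with streaming APIs behind nginx is response buffering: Ollama streams chat completions as newline-delimited JSON chunks, and nginx buffers proxied responses by default, which can hold the stream back so the client never sees it. A sketch of the `location /api/` block above with buffering disabled (not verified against this exact setup):

```nginx
location /api/ {
    proxy_pass http://ollama;
    proxy_redirect off;
    proxy_set_header Host localhost:11434;

    # Stream the response through instead of buffering it;
    # Ollama emits chat/generate output as newline-delimited
    # JSON chunks, and buffering can stall or swallow them.
    proxy_buffering off;
    proxy_cache off;

    # Use HTTP/1.1 to the upstream so chunked transfer encoding
    # works, and clear Connection so the stream stays open.
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_read_timeout 300;
}
```

If this is the cause, /api/tags would still work in a browser (it's a small, non-streamed response), which matches the symptoms described here.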

I'm using a self-signed certificate, and the endpoint configured inside your app is https://<IP>/api. When going to https://<IP>/api/tags from my computer, for example, it returns a result.

Could the issue be that I'm using a self-signed certificate?

Well, it seems not, since I've just disabled HTTPS and am able to get an answer from http://<IP>/api/tags from my computer. Inside your app, I then tried these URLs, with nothing working:

Between each change, I'm restarting the app as well. The app is running on iOS 17.4.1. Enchanted version: 1.6.4 (I'm trusting the version number shown by the App Store). Ollama version: 0.1.32.

If you need more information, don't hesitate to ask; I'll do my best!

kmanan commented 3 months ago

Hi, were you all able to get the app working? I have exposed the Ollama web UI to the internet via nginx at gpt.domain.com, but when I enter that as the Ollama URL, the app doesn't do anything. No error, either.
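@kmanan, one thing worth checking: Enchanted talks to the Ollama API directly, so the URL entered in the app needs to reach Ollama itself (the service on port 11434), not the web UI. A hedged sketch of a server block exposing Ollama on its own subdomain (hostname, certificate paths, and upstream address are assumptions, not from this thread):

```nginx
# Assumed names: ollama.domain.com, Ollama on 127.0.0.1:11434.
server {
    listen 443 ssl http2;
    server_name ollama.domain.com;

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        # Disable buffering so streamed chat responses
        # reach the app chunk by chunk:
        proxy_buffering off;
        proxy_read_timeout 300;
    }
}
```

With a setup like this, the URL to enter in Enchanted would be the subdomain itself (e.g. https://ollama.domain.com), and https://ollama.domain.com/api/tags should return the model list in a browser.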

doakyz commented 2 weeks ago

I can confirm that I'm also seeing issues when running behind nginx.

Here's part of my nginx config:

upstream chat-llm {
    server 192.168.50.99:11434;
}

server {
    server_name chat.domain.com;

    location / {
        proxy_pass http://127.0.0.1:3123; # OpenWebUI
    }

    location /llm/ {
        proxy_pass http://chat-llm/;
        proxy_connect_timeout 300;
        proxy_send_timeout    300;
        proxy_read_timeout    300;
        send_timeout          300;
    }
}

I can confirm that the ollama API is being served correctly, and it actually works just fine with other tools like OpenWeb-UI.

You can see in the screenshot that OpenWeb-UI answers the same prompt in full, whereas Enchanted simply gets cut off.

[screenshot]

I've been able to recreate this with multiple instances of Ollama running on various IPs and with different proxy configs.
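The "cut off" behavior above is consistent with nginx buffering the streamed response: the timeout directives keep the connection open, but by default nginx accumulates the upstream's output in proxy buffers rather than flushing each chunk to the client. If anyone wants to test, a minimal change to the /llm/ location (a sketch, not verified against this setup):

```nginx
location /llm/ {
    proxy_pass http://chat-llm/;
    proxy_http_version 1.1;
    # Flush each streamed chunk to the client immediately
    # instead of holding it in nginx's proxy buffers.
    proxy_buffering off;
    proxy_read_timeout 300;
}
```

Upstreams can also request this per-response by sending an `X-Accel-Buffering: no` header, but turning `proxy_buffering` off in the location is the simplest thing to try here.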