ivanfioravanti / chatbot-ollama

Chatbot Ollama is an open source chat UI for Ollama.

[TypeError: fetch failed] { cause: [Error: AggregateError] } when loading UI hosted on remote host #54

Closed dav-ell closed 2 weeks ago

dav-ell commented 3 weeks ago

I'm hosting the UI on a remote machine like so:

export OLLAMA_HOST=0.0.0.0:11434
ollama serve

docker build -t chatbot-ollama .
docker run -p 3000:3000 -e DEFAULT_MODEL="llama3:latest" -e OLLAMA_HOST="http://localhost:11434" chatbot-ollama

and then I port-forwarded ports 3000 and 11434 to my local machine and opened localhost:3000/ in the browser. This resulted in the errors below when the page loaded.

$ docker run -p 3000:3000 --add-host=host.docker.internal:host-gateway -e DEFAULT_MODEL="llama3:latest" chatbot-ollama

> chatbot-ollama@0.1.0 start
> next start

   ▲ Next.js 14.1.0
   - Local:        http://localhost:3000

 ✓ Ready in 2.4s
[TypeError: fetch failed] { cause: [Error: AggregateError] }
[TypeError: fetch failed] { cause: [Error: AggregateError] }
[TypeError: fetch failed] { cause: [Error: AggregateError] }
[TypeError: fetch failed] { cause: [Error: AggregateError] }
[TypeError: fetch failed] { cause: [Error: AggregateError] }
[TypeError: fetch failed] { cause: [Error: AggregateError] }
[TypeError: fetch failed] { cause: [Error: AggregateError] }

And in the browser console:

Failed to load resource: the server responded with a status of 500 (Internal Server Error)

:3001/api/models:1  Failed to load resource: the server responded with a status of 500 (Internal Server Error)
:3001/api/chat:1    Failed to load resource: the server responded with a status of 500 (Internal Server Error)
models:1            Failed to load resource: the server responded with a status of 500 (Internal Server Error)
dav-ell commented 2 weeks ago

I fixed this by running these commands on the remote:

docker run -it --net host -e DEFAULT_MODEL="llama3:70b" -e OLLAMA_HOST="http://localhost:11434" chatbot-ollama
sudo su  # was getting permission issues without running ollama as root
root$ export OLLAMA_HOST=0.0.0.0:11434
root$ ollama serve

Then, on my client:

ssh -L 11434:localhost:11434 -L 3000:localhost:3000 user@remote

And go to http://localhost:3000 in your browser.
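With the tunnels up, the forwarding can be sanity-checked from the client before opening the UI. This is a suggested check, not part of the original fix; `/api/tags` is Ollama's model-listing endpoint:

```shell
# Should return JSON listing the models, via the 11434 tunnel.
curl http://localhost:11434/api/tags

# Should return an HTTP 200 from the Next.js UI, via the 3000 tunnel.
curl -I http://localhost:3000/
```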

The --net host flag was what seemed to be required. Most likely this is because, on Docker's default bridge network, localhost inside the container refers to the container itself, not the host machine, so OLLAMA_HOST="http://localhost:11434" never reaches the Ollama server running on the host; with --net host the container shares the host's network namespace, and localhost:11434 resolves to the host's Ollama.
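If host networking is undesirable (it is Linux-only and removes network isolation), a bridge-network alternative is to point the container at the host through Docker's host-gateway alias instead of localhost. A sketch, untested against this setup:

```shell
# Keep the default bridge network, but map host.docker.internal to the
# host's gateway IP and use it as the Ollama address.
docker run -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  -e DEFAULT_MODEL="llama3:latest" \
  -e OLLAMA_HOST="http://host.docker.internal:11434" \
  chatbot-ollama
```

This still requires Ollama on the host to listen on all interfaces (OLLAMA_HOST=0.0.0.0:11434), since the request now arrives over the Docker bridge rather than the loopback interface.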