ivanfioravanti / chatbot-ollama

Chatbot Ollama is an open source chat UI for Ollama.

Can't connect to the ollama server at ECONNREFUSED #21

Open gerazov opened 10 months ago

gerazov commented 10 months ago

I set the OLLAMA_HOST like so:

docker run -d -p 3000:3000 -e OLLAMA_HOST="http://127.0.0.1:11434/" --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main

but the app can't connect to the server. docker logs chatbot-ollama shows:

> chatbot-ollama@0.1.0 start
> next start

  ▲ Next.js 13.5.4
  - Local:        http://localhost:3000

 ✓ Ready in 236ms
[TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -111,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}
jowilf commented 10 months ago

If Ollama is running on your host, you have to remove -e OLLAMA_HOST="http://127.0.0.1:11434/" or replace it with OLLAMA_HOST="http://host.docker.internal:11434/", which is the default value.
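For reference, a run command pointing the app at a host-side Ollama could look like the sketch below (not an official command). On plain Linux Docker, host.docker.internal usually has to be mapped explicitly; recent Docker versions support this via the host-gateway value:

docker run -d -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST="http://host.docker.internal:11434" \
  --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main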

gerazov commented 10 months ago

I'm not using Docker to serve ollama. I used their default install script, which sets it up as a systemd service.

It's also available at http://127.0.0.1:11434/ - it echoes "Ollama is running" if I go to that URL in Firefox, and I can also use it via other UIs (https://github.com/ollama-webui/ollama-webui) and Neovim plugins (https://github.com/nomnivore/ollama.nvim).
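A quick check from the host shows the same thing on the command line (sketch, assuming curl is installed):

curl http://127.0.0.1:11434/
# prints: Ollama is running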

ivanfioravanti commented 10 months ago

@jowilf is right, 127.0.0.1 is not reachable from within Docker, so the Next.js app can't reach your Ollama instance running locally. ollama-webui (great project!) is probably running directly on your machine, same for the Neovim plugins. Please test and let us know if it works.
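You can see this by hitting the same address from inside a throwaway container; 127.0.0.1 there is the container's own loopback interface, not the host's (sketch using the public curlimages/curl image):

docker run --rm curlimages/curl http://127.0.0.1:11434/
# fails with something like: curl: (7) Failed to connect to 127.0.0.1 port 11434: Connection refused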

gerazov commented 10 months ago

Hmm, but ollama-webui is running within Docker as per their instructions on Installing Ollama Web UI Only :thinking:

VertigoOne1 commented 10 months ago

You can use '--network host' as part of the docker run command to force the container onto the host network instead of an isolated one, which lets it reach localhost when the systemd ollama is in use.

Example:

docker run -e OLLAMA_HOST="http://localhost:11434" -p 3000:3000 --network host --name chat ghcr.io/ivanfioravanti/chatbot-ollama:main

is how I'm running it right now. Pretty good!
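One note: with --network host the -p 3000:3000 mapping is effectively ignored, since the container shares the host's network stack and the app listens on the host's port 3000 directly. The earlier loopback test also succeeds under host networking, which is why this approach works (sketch, again using the public curlimages/curl image):

docker run --rm --network host curlimages/curl http://127.0.0.1:11434/
# prints: Ollama is running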

gerazov commented 10 months ago

That's awesome :love_you_gesture: Maybe this could be added to the README?