Open gerazov opened 10 months ago
If ollama is running on your host, you have to remove `-e OLLAMA_HOST="http://127.0.0.1:11434/"` or replace it with `-e OLLAMA_HOST="http://host.docker.internal:11434/"`, which is the default value.
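For example, a sketch using this project's image (as given later in this thread); the `--add-host` flag maps `host.docker.internal` to the host gateway, which is needed on Linux where that name isn't resolved by default (Docker Desktop on macOS/Windows resolves it out of the box):

```shell
# Point the containerized app at an Ollama instance running on the host
# (not inside Docker).
docker run \
  -e OLLAMA_HOST="http://host.docker.internal:11434" \
  --add-host=host.docker.internal:host-gateway \
  -p 3000:3000 \
  --name chatbot-ollama \
  ghcr.io/ivanfioravanti/chatbot-ollama:main
```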
I'm not using Docker to serve ollama. I've used their default install script, which sets it up as a systemd service.
It's also available on http://127.0.0.1:11434/ - it echoes `Ollama is running` if I go to that URL in Firefox. I can also use it via other UIs (https://github.com/ollama-webui/ollama-webui) and neovim plugins (https://github.com/nomnivore/ollama.nvim).
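For anyone debugging the same setup, a quick sketch of how to confirm where Ollama is actually reachable from (this assumes `curl` is installed on the host and inside the image, and that the container is named `chat` - adjust to your setup):

```shell
# From the host: should print "Ollama is running".
curl http://127.0.0.1:11434/

# From inside the container: with the default bridge network this fails,
# because 127.0.0.1 there is the container itself, not the host.
docker exec chat curl http://127.0.0.1:11434/
```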
@jowilf is right, 127.0.0.1 is not reachable from within Docker, so the Next.js app can't reach your Ollama instance running locally. ollama-webui (great project!) is probably running locally on your machine, same for the neovim plugins. Please test and let us know if it works.
Hmm, but ollama-webui is running within Docker as per their instructions on "Installing Ollama Web UI Only" :thinking:
You can use `--network host` as part of the `docker run` to force the container onto the host network instead of getting isolated, which allows it to access localhost when the systemd ollama is in use.
example:

```shell
docker run -e OLLAMA_HOST="http://localhost:11434" -p 3000:3000 --network host --name chat ghcr.io/ivanfioravanti/chatbot-ollama:main
```
is how i'm running it right now. Pretty good!
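One caveat worth noting: with `--network host` the container shares the host's network stack, so the `-p 3000:3000` mapping is ignored (Docker prints a warning) and the app is served on the host's port 3000 directly. A sketch of how to verify the container can see the host's Ollama (assuming `curl` is available in the image and the container is named `chat` as above):

```shell
# With host networking, localhost inside the container is the host's
# localhost, so the systemd-managed Ollama answers directly.
docker exec chat curl http://localhost:11434/
```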
That's awesome :love_you_gesture: maybe this can be added to the README?
I set the `OLLAMA_HOST` like so:

and can't connect the app to the server.
`docker logs chatbot-ollama` reads: