Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[Unable to load Ollama with Docker AnythingLLM]: can't load the local Ollama with Docker AnythingLLM #2548

Closed ddimit2020 closed 3 weeks ago

ddimit2020 commented 3 weeks ago

How are you running AnythingLLM?

Docker (local)

What happened?

I followed these steps to install AnythingLLM with Docker:

1. Pull the image:

```shell
docker pull mintplexlabs/anythingllm
```

2. Set up the storage location and run the container:

```shell
export STORAGE_LOCATION=$HOME/anythingllm && \
mkdir -p $STORAGE_LOCATION && \
touch "$STORAGE_LOCATION/.env"

docker run -d -p 3001:3001 \
--add-host=host.docker.internal:host-gateway \
--cap-add SYS_ADMIN \
-v ${STORAGE_LOCATION}:/app/server/storage \
-v ${STORAGE_LOCATION}/.env:/app/server/.env \
-e STORAGE_DIR="/app/server/storage" \
mintplexlabs/anythingllm
```

3. Edited /etc/systemd/system/ollama.service to add two environment variables under [Service]:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_MODELS=/www/algorithm/LLM_model/models"
```

4. Reloaded and restarted the service:

```shell
systemctl daemon-reload
systemctl restart ollama
```

5. Then set up AnythingLLM, but it can't load the local Ollama (see the connectivity check sketched below).
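To narrow down where the connection fails, one option is to test the Ollama API from inside the container itself. A minimal sketch, assuming the `<container-id>` placeholder is replaced with whatever `docker ps` shows, and that `curl` is available in the image (not guaranteed):

```shell
# Find the running AnythingLLM container
docker ps --filter ancestor=mintplexlabs/anythingllm

# From inside the container, try the endpoint the UI would use
docker exec <container-id> curl -is http://host.docker.internal:11434/api/tags
```

If this returns nothing, the problem is container-to-host networking rather than AnythingLLM's configuration.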

I also already tried the URL http://host.docker.internal:11434, but it still can't load the Ollama models.

Did I miss a step?

OS: Ubuntu 20.04.6
Ollama: installed on the Ubuntu host (localhost:11434)
Docker: version 24.0.7
AnythingLLM: installed via Docker

Any ideas?

Are there known steps to reproduce?

No response

timothycarambat commented 3 weeks ago

Are you able to even reach the ollama API via the http://192.168.58.164:11434 URL in a browser? http://192.168.58.164:11434/api/tags should return a JSON response with all installed models.

If the browser cannot resolve that location, then the container certainly cannot.
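For reference, the same check from a terminal; this is a sketch and the response shown is only an example of the expected shape, not actual output:

```shell
curl -i http://192.168.58.164:11434/api/tags
# A reachable Ollama responds with HTTP 200 and a JSON body like:
# {"models":[{"name":"llama3:latest","size":4661224676, ...}]}
```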

ddimit2020 commented 3 weeks ago

Before your tips, I actually couldn't open http://192.168.58.164:11434 or http://192.168.58.164:11434/api/tags. After modifying /etc/hosts and running ufw disable, I can now access them, and I set OLLAMA_HOST=192.168.58.164 and ran ollama serve. Both http://192.168.58.164:11434 and http://192.168.58.164:11434/api/tags work, but AnythingLLM still can't load the Ollama models. Any idea? I'm so confused.

The Docker logs and Ollama logs: [screenshot: docker_logs]

The browser still can't load the Ollama models: [screenshot: web]

But Ollama does have the models: [screenshot: ollama]

Browser showing Ollama running: [screenshot: running]

Browser hitting the Ollama API (not from the Ubuntu server's own browser): [screenshot: api-tags]

But on the Ubuntu host itself (where the Ollama service runs and AnythingLLM is installed via Docker), /api/tags does work: [screenshot: ubuntu]

I also tried curl -i http://192.168.58.164:11434/api/tags and curl -i http://localhost:11434/api/tags, and I think this is the issue: localhost can reach the Ollama models, but the IP can't. I still have no idea how to solve it: [screenshot: api-tags-2]
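That localhost-works-but-IP-doesn't pattern usually means the server is bound to the loopback interface only. A minimal sketch of how one might check the binding and whether the systemd override from the first comment actually took effect:

```shell
# Show the address Ollama is listening on:
# 127.0.0.1:11434 means loopback only; 0.0.0.0:11434 or *:11434 means all interfaces
sudo ss -tlnp | grep 11434

# Confirm the Environment= overrides were picked up by systemd
systemctl show ollama --property=Environment

# Re-apply them if they weren't
sudo systemctl daemon-reload && sudo systemctl restart ollama
```

Note that running `ollama serve` by hand, as above, only honors OLLAMA_HOST if it is exported in that shell; the systemd Environment= setting does not apply to a manually started process.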

========== Final ==========

I restored the original OS and reinstalled Ollama. Now http://192.168.58.164:11434 and http://localhost:11434 both show Ollama running, and http://192.168.58.164:11434/api/tags and http://localhost:11434/api/tags both return the JSON model list. Next I'll install Docker with AnythingLLM and hope it works; trying it now.

timothycarambat commented 3 weeks ago

I overlooked this earlier, but host.docker.internal does not work on Ubuntu: https://docs.anythingllm.com/ollama-connection-troubleshooting#url-detection-failed-when-manual-endpoint-input-is-expanded-the-url-was-not (see the "Docker" tab):

> Docker Version
> Windows/macOS: http://host.docker.internal:11434
> Linux: http://172.17.0.1:11434
>
> On Linux, use http://172.17.0.1:11434 as host.docker.internal doesn't work.

Have you tried http://172.17.0.1:11434 as the URL? This maps from the container to the host's localhost, and since localhost now works on the host, this should too.
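For completeness: 172.17.0.1 is the gateway of Docker's default bridge network, so it can differ if the container is attached to a custom network. A quick sketch for confirming the right address on a given machine:

```shell
# Print the gateway IP of the default bridge network
# (this is the address at which the container can reach the host)
docker network inspect bridge --format '{{(index .IPAM.Config 0).Gateway}}'

# Then verify Ollama answers on that address from the host
curl -i http://172.17.0.1:11434/api/tags
```

For the container to reach Ollama at the bridge gateway, Ollama must listen on all interfaces, which is what the OLLAMA_HOST=0.0.0.0:11434 setting earlier in this thread is for.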