ddimit2020 closed this issue 3 days ago
Are you able to even reach the ollama API via the http://192.168.58.164:11434
URL in a browser?
http://192.168.58.164:11434/api/tags should return a JSON response with all installed models.
If the browser cannot resolve that location, then the container certainly cannot.
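The check above can be scripted. This is a minimal sketch (192.168.58.164 is the host IP from this thread; `check_ollama` is a hypothetical helper name, substitute your own address):

```shell
# A healthy Ollama answers the root path with "Ollama is running"
# and /api/tags with a JSON list of installed models.
check_ollama() {
  local base="$1"
  curl -fsS --connect-timeout 2 "$base/" || return 1
  curl -fsS --connect-timeout 2 "$base/api/tags" || return 1
}

check_ollama http://192.168.58.164:11434 || echo "Ollama unreachable -- check firewall and OLLAMA_HOST"
```

If this fails from the machine itself, the problem is the bind address or a firewall, not Docker.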
Before your tips I actually couldn't open http://192.168.58.164:11434 or http://192.168.58.164:11434/api/tags. After modifying /etc/hosts and running `ufw disable`, and starting with `OLLAMA_HOST=192.168.58.164 ollama serve`, I can now access both URLs, but AnythingLLM still can't load the Ollama model. Any ideas? I'm confused.
Screenshots (attached):
- the Docker logs and the Ollama logs
- the browser still can't load the Ollama model, but Ollama does have the model
- the browser shows Ollama running
- the browser can reach the Ollama API (not opened from the Ubuntu server's own browser)
- on the Ubuntu machine itself (running the Ollama service and the Docker-installed AnythingLLM), /api/tags does display
I also tried `curl -i http://192.168.58.164:11434/api/tags` and `curl -i http://localhost:11434/api/tags`, and I think this is the issue: localhost can reach the Ollama models, but the IP can't. I still have no idea how to solve it.
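"localhost works, the IP doesn't" usually means the daemon is bound to the loopback interface only. A diagnostic sketch (addresses are the ones from this thread; `is_loopback_bound` is a hypothetical helper that just greps a socket listing):

```shell
# Does a socket listing show a loopback-only bind on port 11434?
is_loopback_bound() {
  echo "$1" | grep -q '127\.0\.0\.1:11434'
}

# Compare the two endpoints:
curl -fsS --connect-timeout 2 http://localhost:11434/api/tags      || echo "localhost: unreachable"
curl -fsS --connect-timeout 2 http://192.168.58.164:11434/api/tags || echo "192.168.58.164: unreachable"

# Inspect what Ollama is actually listening on:
listen_line="$(ss -tln 2>/dev/null | grep 11434 || true)"
if is_loopback_bound "$listen_line"; then
  echo "bound to 127.0.0.1 only -- set OLLAMA_HOST=0.0.0.0"
fi
```

A `127.0.0.1:11434` listener is reachable only as localhost; `0.0.0.0:11434` (or `*:11434`) is reachable on every interface, including from containers.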
Final update: I restored the original OS and installed Ollama again. Now http://192.168.58.164:11434 and http://localhost:11434 both show that Ollama is running, and http://192.168.58.164:11434/api/tags and http://localhost:11434/api/tags both return the JSON model list. Next I'll install Docker with AnythingLLM and hope it works; doing that now.
I overlooked this earlier, but host.docker.internal does not work on Ubuntu: https://docs.anythingllm.com/ollama-connection-troubleshooting#url-detection-failed-when-manual-endpoint-input-is-expanded-the-url-was-not

From the "Docker" tab:
Docker Version
Windows/macOS: http://host.docker.internal:11434
Linux: http://172.17.0.1:11434
On Linux, use http://172.17.0.1:11434 as host.docker.internal doesn't work.
Have you tried http://172.17.0.1:11434 as the URL? This maps the container to the host's localhost, and since localhost now works, this should too.
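172.17.0.1 is the default Docker bridge gateway, but it can differ. A sketch of how to confirm it and then test from inside the container (the container name is hypothetical; check `docker ps`, and the `docker exec` line assumes curl exists in the image):

```shell
# Extract the IPv4 address from an `ip addr` line (hypothetical helper).
bridge_ip_from() {
  sed -n 's@.*inet \([0-9.]*\)/.*@\1@p'
}

# The bridge gateway as seen on the host:
ip -4 addr show docker0 2>/dev/null | bridge_ip_from || true

# Or ask Docker directly:
docker network inspect bridge --format '{{(index .IPAM.Config 0).Gateway}}' 2>/dev/null || true

# Then verify the host's Ollama is reachable from inside the container:
# docker exec <container_id> curl -fsS http://172.17.0.1:11434/api/tags
```

If the in-container curl returns the JSON model list, the same URL will work in AnythingLLM's Ollama settings.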
How are you running AnythingLLM?
Docker (local)
What happened?
I also followed the steps to install AnythingLLM with Docker:
docker run -d -p 3001:3001 \
  --add-host=host.docker.internal:host-gateway \
  --cap-add SYS_ADMIN \
  -v ${STORAGE_LOCATION}:/app/server/storage \
  -v ${STORAGE_LOCATION}/.env:/app/server/.env \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm
3. Also edited /etc/systemd/system/ollama.service, adding two environment variables under [Service]:
   Environment="OLLAMA_HOST=0.0.0.0:11434"
   Environment="OLLAMA_MODELS=/www/algorithm/LLM_model/models"
5. Then set up AnythingLLM; it can't load the localhost Ollama.
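A note on step 3: after editing the unit file, systemd must re-read it and the service must be restarted before OLLAMA_HOST takes effect. A sketch of the usual steps (a live system is required, so this is operational reference only):

```shell
# Re-read unit files and restart the service:
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Confirm the bind address -- should show 0.0.0.0:11434, not 127.0.0.1:11434:
ss -tln | grep 11434
```

If the listener still shows 127.0.0.1, the edited unit file is not the one systemd is actually loading (check `systemctl cat ollama`).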
I also already tried the URL http://host.docker.internal:11434, but it still can't load the Ollama model.
Did I miss a step?
- OS: Ubuntu 20.04.6
- Ollama: installed on the Ubuntu host (localhost:11434)
- Docker: version 24.0.7
- AnythingLLM: installed via Docker
Any ideas?
Are there known steps to reproduce?
No response