Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[BUG]: AnythingLLM Stuck at "--loading available models--" #1338

Closed · cope closed this issue 5 months ago

cope commented 5 months ago

How are you running AnythingLLM?

Docker (local)

What happened?

I started Ollama with Docker:

```bash
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

I then pulled some models:

```bash
ollama pull llama3:8b-instruct-q8_0
ollama pull mxbai-embed-large:335m
ollama pull codellama:7b-code-q8_0
```
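
As a quick sanity check, the pulled models can be listed from inside the Ollama container (a minimal sketch, using the container name from the run command above):

```bash
# List the models known to the Ollama instance running in the container
docker exec -it ollama ollama list
```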

Then I started AnythingLLM with Docker:

```bash
docker run -d -p 3001:3001 --cap-add SYS_ADMIN -e STORAGE_DIR="/server/storage" -v "D:/db/anythingllm:/server/storage" -v "D:/db/anythingllm/env:/server/.env" --name anything-llm mintplexlabs/anythingllm:master
```

Then I opened http://localhost:3001, went to Get Started, selected Ollama, and entered http://localhost:11434, but it does not load the available models; it is stuck at "--loading available models--" (screenshot attached).

http://localhost:11434 responds as expected (screenshot attached).

http://localhost:11434/api/tags lists the pulled models (screenshot attached).
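
One way to check whether the AnythingLLM container can actually reach the URL entered in the UI is to run the same request from inside it (a sketch, assuming curl is available in the image):

```bash
# Issue the request from inside the AnythingLLM container rather than the host
docker exec -it anything-llm curl http://localhost:11434/api/tags
```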

Are there known steps to reproduce?

I described all the steps above.

cope commented 5 months ago

Heh... I forgot I was in Docker. I needed to use http://host.docker.internal:11434 instead of http://localhost:11434, and it worked.
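
For context: inside a container, localhost refers to the container itself, not the host, which is why the AnythingLLM container could never reach Ollama at http://localhost:11434. On Docker Desktop (Windows/macOS), host.docker.internal resolves to the host; on Linux it has to be added manually with --add-host=host.docker.internal:host-gateway. An alternative sketch that avoids the host alias entirely is to attach both containers to a shared user-defined network and address Ollama by container name:

```bash
# Create a shared network and attach both already-running containers to it
docker network create llm-net
docker network connect llm-net ollama
docker network connect llm-net anything-llm
# AnythingLLM can then use http://ollama:11434 as the Ollama base URL
```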

Closing for being dumb.