How are you running AnythingLLM?
Docker (local)
What happened?
I started Ollama with Docker:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
I then loaded some models.

Then I started AnythingLLM with Docker:
docker run -d -p 3001:3001 --cap-add SYS_ADMIN -e STORAGE_DIR="/server/storage" -v "D:/db/anythingllm:/server/storage" -v "D:/db/anythingllm/env:/server/.env" --name anything-llm mintplexlabs/anythingllm:master
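For reference, the model-loading step mentioned above can be sketched like this (the model name `llama3` is an assumption; the report does not say which models were pulled):

```shell
# Hypothetical sketch: pull a model into the running Ollama container
# (substitute whichever models were actually loaded)
docker exec -it ollama ollama pull llama3

# Confirm the models are present inside the container
docker exec -it ollama ollama list
```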
Then I opened http://localhost:3001, went through Get Started, selected Ollama, and entered http://localhost:11434 as the base URL, but it does not load the available models; it is stuck at:
http://localhost:11434 shows:
http://localhost:11434/api/tags shows:
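As a diagnostic sketch, the two endpoints above can be checked from the host, and also from inside the AnythingLLM container (the `host.docker.internal` hostname is an assumption about a Docker Desktop setup, and `curl` may not be installed in the `anything-llm` image; inside a container, `localhost` refers to the container itself, not the host):

```shell
# From the Windows host: the root endpoint normally replies with a short
# status text, and /api/tags with a JSON list of installed models
curl http://localhost:11434
curl http://localhost:11434/api/tags

# From inside the AnythingLLM container, localhost points at the container
# itself; on Docker Desktop the host is typically reachable as
# host.docker.internal instead
docker exec anything-llm curl http://host.docker.internal:11434/api/tags
```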
Are there known steps to reproduce?
I described all the steps above.