Closed: yeeeuw closed this issue 3 months ago.
Which model are you using? Could you provide your .env file?
Below is my .env:

```
LLM=llama2 # or any Ollama model tag, gpt-4, gpt-3.5, or claudev2
EMBEDDING_MODEL=sentence_transformer # or google-genai-embedding-001, openai, ollama, or aws
NEO4J_URI=neo4j://database:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password
OLLAMA_BASE_URL=http://host.docker.internal:11434
GOOGLE_API_KEY=
```
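For context, the two OLLAMA_BASE_URL values discussed in this thread correspond to two different setups. A hedged sketch of how they might appear in docker-compose.yml (the service names `bot` and `llm` are assumptions for illustration, not taken from this repo's actual compose file):

```yaml
services:
  bot:
    env_file: .env
    environment:
      # Setup 1: ollama runs on the host machine (Docker Desktop / WSL):
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
      # Setup 2: ollama runs as a compose service named "llm" on the
      # same compose network; uncomment instead of the line above:
      # - OLLAMA_BASE_URL=http://llm:11434
```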
That seems weird; llama2 should work. What's your runtime environment, and can you share your docker-compose.yml?
*edited* Have you installed ollama on your host machine? `OLLAMA_BASE_URL=http://host.docker.internal:11434` <= this points the container at the ollama running on your host machine.
P.S. I encountered the same problem while using gemma2:2b and fixed it by following this guide. You might also install ollama on your host machine and connect to it from the container.
OK, did two things: confirmed that OLLAMA_BASE_URL is set to http://llm:11434; same error as before.
When running ollama locally and following the guide you cited, I get an error when chatting with the bot at http://localhost:8501/:
```
ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x73f8ba4db190>: Failed to establish a new connection: [Errno 111] Connection refused'))
```
This is with OLLAMA_BASE_URL set to http://host.docker.internal:11434.
The ConnectionError shows that it can't connect to your local ollama. Is your local ollama running? Check it in your terminal with:

```
curl 127.0.0.1:11434
```

You should get an "Ollama is running" message. If it is not running, start it with:

```
ollama serve
```
Re previous comment: ollama is running, but I believe you may have issues reaching the host via host.docker.internal on Linux. I've subsequently tried this on Windows, which hits the exact same error as the original issue around $HOME. Also on Windows, if I follow the guide you referenced, `docker compose up` doesn't work because the api service fails its health check; disabling the health check seems to generate further issues.
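On native Linux, host.docker.internal indeed doesn't resolve by default. Docker Engine 20.10+ supports mapping it to the host's gateway via `extra_hosts`. A minimal sketch, assuming the app service is named `bot` (hypothetical name, not from this repo's compose file):

```yaml
services:
  bot:
    extra_hosts:
      # Maps host.docker.internal to the host's gateway IP so the
      # container can reach an ollama server running on the host.
      - "host.docker.internal:host-gateway"
```

With this mapping in place, OLLAMA_BASE_URL=http://host.docker.internal:11434 should behave on Linux the way it does on Docker Desktop.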
Tracked it down to a machine-specific issue; this is working.
Hi, I think this might be because the configuration in the install scripts doesn't comply with the latest ollama docker image. Try using an ollama image version from before June, together with llama2.
FYI: I am currently still running it per your guide; I haven't got ollama to work in docker.
Also, which operating system do you use? I don't believe host.docker.internal works on Linux.
I am using Ubuntu 24.04 in WSL. Haven't tried running it on Linux natively.
Hi, I think I've found the fix, and it's actually quite simple. Just add "HOME" (System/getProperty "user.home") in the clojure script, and the "$HOME is not defined" error is fixed. You might check out this commit. I guess the problem is caused by the ollama update in June or some newer version.
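An alternative to patching the clojure script is to hand the container a HOME value directly in the compose file, since the panic comes from ollama's envconfig reading $HOME. A sketch, assuming the failing service is named `pull-model` (the `/tmp` value is an arbitrary writable path, not from the repo):

```yaml
services:
  pull-model:
    environment:
      # Give ollama's envconfig a HOME so it can build its default
      # models path; any writable directory works for a one-shot pull.
      - HOME=/tmp
```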
On Ubuntu 24.04, when I run:

```
docker compose --profile linux up
```

I get the following error:
```
pull-model-1  | pulling ollama model llama2 using http://host.docker.internal:11434
pull-model-1  | panic: $HOME is not defined
pull-model-1  |
pull-model-1  | goroutine 1 [running]:
pull-model-1  | github.com/ollama/ollama/envconfig.Models()
pull-model-1  |         github.com/ollama/ollama/envconfig/config.go:93 +0xa9
pull-model-1  | github.com/ollama/ollama/envconfig.AsMap()
pull-model-1  |         github.com/ollama/ollama/envconfig/config.go:253 +0x699
pull-model-1  | github.com/ollama/ollama/cmd.NewCLI()
pull-model-1  |         github.com/ollama/ollama/cmd/cmd.go:1329 +0xb68
pull-model-1  | main.main()
pull-model-1  |         github.com/ollama/ollama/main.go:11 +0x13
pull-model-1  | panic: $HOME is not defined
pull-model-1  |
pull-model-1  | goroutine 1 [running]:
pull-model-1  | github.com/ollama/ollama/envconfig.Models()
pull-model-1  |         github.com/ollama/ollama/envconfig/config.go:93 +0xa9
pull-model-1  | github.com/ollama/ollama/envconfig.AsMap()
pull-model-1  |         github.com/ollama/ollama/envconfig/config.go:253 +0x699
pull-model-1  | github.com/ollama/ollama/cmd.NewCLI()
pull-model-1  |         github.com/ollama/ollama/cmd/cmd.go:1329 +0xb68
pull-model-1  | main.main()
pull-model-1  |         github.com/ollama/ollama/main.go:11 +0x13
database-1    | Installing Plugin 'apoc' from /var/lib/neo4j/labs/apoc-*-core.jar to /var/lib/neo4j/plugins/apoc.jar
database-1    | Applying default values for plugin apoc to neo4j.conf
pull-model-1 exited with code 1
```
Further down:

```
database-1    | 2024-08-05 10:42:14.659+0000 INFO  Started.
Gracefully stopping... (press Ctrl+C again to force)
service "pull-model" didn't complete successfully: exit 1
```
It seems the pull fails because $HOME doesn't exist, so the default models path $HOME/.ollama/models can't be resolved.
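That matches the panic: ollama appears to derive its default models directory from $HOME, so with HOME unset the path can't be built. A minimal shell sketch of that fallback logic and an OLLAMA_MODELS-based workaround (the derivation is my reading of the envconfig.Models frame in the trace, not the actual Go source; /data/ollama/models is an arbitrary example path):

```shell
# Default: derive the models dir from $HOME, as ollama's envconfig
# seems to do. With HOME unset, this expands to a broken path.
models_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "default models dir: $models_dir"

# Workaround: set OLLAMA_MODELS explicitly so $HOME is never consulted.
export OLLAMA_MODELS=/data/ollama/models
models_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "override models dir: $models_dir"
```

Exporting OLLAMA_MODELS (or giving the container a HOME) sidesteps the panic without changing the image version.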