zer0u1tra opened 1 month ago
Here is the Verba output when trying to embed a .pdf document:
(INFO) Importing...
(INFO) Importing 1 files with BasicReader
(INFO) Importing Lanier.pdf
(SUCCESS) Loaded 1 documents in 0.46s
(INFO) Starting Chunking with TokenChunker
(SUCCESS) Chunking completed with 232 chunks in 0.03s
(INFO) Starting Embedding with OllamaEmbedder
(ERROR) HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x72f4537f4310>: Failed to establish a new connection: [Errno 111] Connection refused'))
I am having the same issue. I am not sure what is causing it, but I have put several hours into this with no progress.
Your Docker container can't access localhost outside of its container; you need to set OLLAMA_URL to http://host.docker.internal:11434
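A minimal sketch of what that suggestion might look like in a compose file (the service name `verba` is a placeholder, not taken from the thread; on Linux, `host.docker.internal` additionally needs the `extra_hosts` mapping, while Docker Desktop on Windows/macOS defines it automatically):

```yaml
services:
  verba:                                          # hypothetical service name
    environment:
      OLLAMA_URL: http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"       # required on Linux hosts only
```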
I am not using a Docker container. I am running Ollama natively on Windows, and Verba in WSL. Maybe that is the issue? I will check if I can hit Ollama from the WSL window.
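One quick way to check reachability from WSL is a small probe against Ollama's `/api/tags` endpoint (which just lists installed models). This is a diagnostic sketch, not part of Verba; the function name `check_ollama` is made up for illustration:

```python
import urllib.request
import urllib.error


def check_ollama(url="http://localhost:11434/api/tags", timeout=3):
    """Return True if an Ollama server answers at `url`, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Covers "Connection refused" (the error in the log above) and timeouts
        return False
```

If this returns False from inside WSL while Ollama is running on Windows, the problem is networking between WSL and the Windows host, not Verba itself.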
Your Docker container can't access localhost outside of its container, you need to set OLLAMA_URL to
http://host.docker.internal:11434
I started with that and got the same error, just coming from host.docker.internal instead. What I posted is what I tried after the fact.
Anyone else have any ideas?
I think my situation with WSL is similar: it runs behind Hyper-V, and by default port forwarding is not enabled from WSL to Windows, though it IS enabled the other way. However, even after enabling port forwarding, it still cannot connect.
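For the WSL case, a sketch of the usual workaround (hedged: this assumes WSL2's default NAT networking and Ollama's documented OLLAMA_HOST variable; with mirrored networking the resolv.conf lookup does not apply, and Windows Firewall may still need to allow inbound traffic on port 11434):

```shell
# On Windows (PowerShell/cmd): make Ollama listen on all interfaces, then restart it
setx OLLAMA_HOST 0.0.0.0

# Inside WSL: under default NAT networking, the Windows host is the
# nameserver listed in /etc/resolv.conf
WIN_HOST=$(awk '/^nameserver/ {print $2}' /etc/resolv.conf)
export OLLAMA_URL="http://$WIN_HOST:11434"
```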
Yeah I have been running it via the docker-compose. It works via the pip install method, but I'd like to also use the weaviate integration that already comes with the docker setup.
Hi, try hardcoding these two variables in the Docker file: OLLAMA_URL=http://localhost:11434 and OLLAMA_MODEL=llama3. Also verify the variables in the admin screen in the UI before importing.
This also does not work.
I'm also having this issue. I'm using Docker to spin up the container:
docker compose --env-file .env up -d
and my .env is:
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3
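Note that inside the container, localhost refers to the container itself, so this .env points Verba at a port where nothing is listening. Assuming Ollama runs on the Docker host, a sketch of the adjusted .env would be:

```ini
# .env — hypothetical: assumes Ollama runs on the Docker host, not in the container
OLLAMA_URL=http://host.docker.internal:11434
OLLAMA_MODEL=llama3
```

On a Linux host, host.docker.internal also requires adding `extra_hosts: ["host.docker.internal:host-gateway"]` to the service in the compose file; Docker Desktop provides it out of the box.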
After a couple of hours tinkering, I found a solution for those running Ollama and Verba locally on a Windows machine. You can see it here https://www.robotstud.io/how-to-run-verba-and-ollama-locally-on-your-windows-machine/
This is a great resource, let me know if this helps to fix the issue
We are getting this error when using the following docker-compose YAML:
And the following .env:
Has anyone else been getting this / have a workaround?