Open t-dsai opened 6 months ago
@t-dsai, I'm having the same issue. Please tell me how you resolved it.
Same issue here.
Same here.
Same issue
Please fix this; multiple people are having this issue.
For me it was a proxy issue. I solved it by setting the proxy described here in my settings.yml: https://docs.searxng.org/admin/settings/settings_outgoing.html
```yaml
outgoing:
  proxies:
    all://:
```
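For anyone else hitting this: a fuller sketch of that `settings.yml` section, following the format in the linked SearXNG docs. The proxy URL below is a placeholder; the actual host and port depend on your network.

```yaml
# settings.yml -- route all outgoing SearXNG requests through a proxy.
# http://proxy.example.com:8080 is a placeholder; replace it with your own proxy.
outgoing:
  proxies:
    all://:
      - http://proxy.example.com:8080
```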
Hi,

Thank you for sharing your work.

On a Debian 12 machine with Docker running, I already have Ollama with the all-minilm, nomic-embed-text, and spooknik/hermes-2-pro-mistral-7b models available. My `.env` file contains only one line: `OLLAMA_HOST=http://localhost:11434`.

I can see from `docker ps -a` that the following containers are present (in the exited state after `ctrl+c`):

```
0e363e666648 nilsherzig/llocalsearch-frontend:latest
2d415fe8d659 chromadb/chroma
40dfd8260b28 searxng/searxng:latest
a49ae53eaf6f redis:alpine
7bb5fd2c08d1 nilsherzig/llocalsearch-backend:latest
```
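(One thing worth checking with that `.env` value: if Ollama runs on the Docker host rather than in a container, `localhost` inside the backend container resolves to the container itself, not to the host. A common Docker workaround on Linux, sketched here with a hypothetical service name, not taken from the LLocalSearch compose file, is:)

```yaml
# docker-compose override sketch; "backend" is a hypothetical service name.
services:
  backend:
    environment:
      # host.docker.internal points back at the Docker host machine
      - OLLAMA_HOST=http://host.docker.internal:11434
    extra_hosts:
      # needed on Linux so host.docker.internal resolves to the host gateway
      - "host.docker.internal:host-gateway"
```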
I cloned the LLocalSearch repo. When I run `docker compose up`, I get the following errors (and many more similar ones after these). Can anyone help me run LLocalSearch successfully?

Please note that my system does not have `anyio`, `httpcore`, or `httpx` in the directory `/usr/lib/python3.11`. I do have a few Python environments where these packages are installed. If these packages are necessary to run LLocalSearch, can LLocalSearch use a custom Python environment to find them?

Thanks in advance.
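As a quick way to check which interpreter actually has those packages (a generic sketch, not specific to LLocalSearch), one can try importing them from each candidate Python:

```shell
# Try importing all three packages from the system Python.
# Swap in the path of a virtualenv interpreter (e.g. ~/venvs/foo/bin/python)
# to test that environment instead.
python3 -c 'import anyio, httpcore, httpx' 2>/dev/null \
  && echo "packages present" \
  || echo "packages missing"
```

Note that the containers bundle their own dependencies, so packages on the host are usually separate from what runs inside Docker.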