nilsherzig / LLocalSearch

LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
Apache License 2.0
5.67k stars · 362 forks

minimal instructions on how to setup ollama for this project? #34

Closed · zaggynl closed this 7 months ago

zaggynl commented 7 months ago

I installed ollama and downloaded the all-minilm:v2 model mentioned in this project's error message:

```
Model all-minilm:v2 does not exist and could not be pulled: Post "http://127.0.0.1:11434/api/pull": dial tcp 127.0.0.1:11434: connect: connection refused
```

`curl http://127.0.0.1:11434` shows: `Ollama is running`

```
$ ollama list
NAME            ID              SIZE    MODIFIED
all-minilm:v2   1b226e2802db    45 MB   11 minutes ago
```

nilsherzig commented 7 months ago

Hi, could you please try this as the domain? (Just run `git pull` to get the current configs.)

https://github.com/nilsherzig/LLocalSearch/blob/a422c14c94be5bbc7b7b9da56c3d155fd7e0d555/docker-compose.yaml#L7-L20

Localhost (127.0.0.1) inside the container is not the localhost of your host system (where ollama is probably running).
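A quick way to see the difference from the command line (a sketch, not from the repo; assumes Docker 20.10+ for the `host-gateway` mapping and a host-side Ollama on port 11434):

```shell
# On the host, 127.0.0.1:11434 reaches Ollama directly:
curl http://127.0.0.1:11434

# Inside a container, 127.0.0.1 refers to the container itself,
# so the same request is refused:
docker run --rm curlimages/curl -s http://127.0.0.1:11434 || echo "refused"

# With the extra_hosts mapping from the linked docker-compose.yaml,
# host.docker.internal resolves to the host's gateway instead:
docker run --rm --add-host=host.docker.internal:host-gateway \
  curlimages/curl -s http://host.docker.internal:11434
```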

Trisert commented 7 months ago

Hi, I'm having the same problem running on NixOS. I'm using the latest pull from the repo and the same docker-compose.yaml that uses the extra_hosts config, but it still doesn't seem to resolve the host's localhost from docker. My docker-compose version is 2.23.1.

nilsherzig commented 7 months ago

> Hi, I'm having the same problem running on NixOS. I'm using the latest pull from the repo and the same docker-compose.yaml that uses the extra_hosts config, but it still doesn't seem to resolve the host's localhost from docker. My docker-compose version is 2.23.1.

Is your ollama instance listening on the right network interface?

Trisert commented 7 months ago

> > Hi, I'm having the same problem running on NixOS. I'm using the latest pull from the repo and the same docker-compose.yaml that uses the extra_hosts config, but it still doesn't seem to resolve the host's localhost from docker. My docker-compose version is 2.23.1.
>
> Is your ollama instance listening on the right network interface?

I've tried starting ollama by running `ollama serve` and also with `OLLAMA_HOST=0.0.0.0 ollama serve`. Since I haven't used Ollama much, it's likely that I'm doing something wrong. If you can suggest where I might be going wrong in initializing the server, I'll try a different configuration.
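One way to check which interface Ollama is actually bound to (a sketch, assuming a Linux host with `ss` from iproute2 available):

```shell
# 127.0.0.1:11434 in the output means loopback only, which is
# unreachable from containers on the default bridge network;
# 0.0.0.0:11434 or *:11434 means all interfaces, which is what
# OLLAMA_HOST=0.0.0.0 requests:
ss -tlnp | grep 11434
```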

kkokosa commented 7 months ago

I needed to pull `all-minilm` explicitly, although `all-minilm:v2` was already pulled by the initial script.

mike-luabase commented 7 months ago

@kkokosa that fixed it for me:

```
ollama pull all-minilm
```

zaggynl commented 7 months ago

> Hi, could you please try this as the domain? (Just run `git pull` to get the current configs.)
>
> https://github.com/nilsherzig/LLocalSearch/blob/a422c14c94be5bbc7b7b9da56c3d155fd7e0d555/docker-compose.yaml#L7-L20
>
> Localhost (127.0.0.1) inside the container is not the localhost of your host system (where ollama is probably running).

Thanks for the reply, I've tried both:

```yaml
      - OLLAMA_HOST=http://host.docker.internal:11434
#       - OLLAMA_HOST=http://127.0.0.1:11434
```

`OLLAMA_HOST=0.0.0.0 ollama serve` seems to have done the trick as a workaround; I'll add this to the systemd file.

In /etc/systemd/system/ollama.service, I added `Environment="OLLAMA_HOST=0.0.0.0"` below the existing line `Environment="PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games"`, then ran:

```
sudo systemctl daemon-reload
sudo service ollama restart
```
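For reference, the same override can be made without editing the packaged unit file directly, via a systemd drop-in (a sketch; assumes the service is named `ollama`):

```shell
# Creates /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl edit ollama
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
# Then reload and restart:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

A drop-in survives package upgrades that replace the main unit file, which is why it is usually preferred over editing the unit in place.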

nilsherzig commented 7 months ago

> I needed to pull `all-minilm` explicitly, although `all-minilm:v2` was already pulled by the initial script.

Oh, I forgot to make a new docker release after committing https://github.com/nilsherzig/LLocalSearch/commit/50646ebfccb05e0b3daa12ba17fc622f3810b1d7. I'm going to automate this in the future. New containers are now live.

nilsherzig commented 7 months ago

> > Hi, could you please try this as the domain? (Just run `git pull` to get the current configs.)
> >
> > https://github.com/nilsherzig/LLocalSearch/blob/a422c14c94be5bbc7b7b9da56c3d155fd7e0d555/docker-compose.yaml#L7-L20
> >
> > Localhost (127.0.0.1) inside the container is not the localhost of your host system (where ollama is probably running).
>
> Thanks for the reply, I've tried both:
>
> ```yaml
>       - OLLAMA_HOST=http://host.docker.internal:11434
> #       - OLLAMA_HOST=http://127.0.0.1:11434
> ```
>
> `OLLAMA_HOST=0.0.0.0 ollama serve` seems to have done the trick as a workaround; I'll add this to the systemd file.
>
> In /etc/systemd/system/ollama.service, I added `Environment="OLLAMA_HOST=0.0.0.0"` below the existing line `Environment="PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games"`, then ran `sudo systemctl daemon-reload` and `sudo service ollama restart`.

I think I should change the docker networking. The whole localhost story is a bit too confusing for people who run Ollama on the same machine. Having a private network between the containers is nice in theory, though.

blakkd commented 2 months ago

> Hi, could you please try this as the domain? (Just run `git pull` to get the current configs.)
>
> https://github.com/nilsherzig/LLocalSearch/blob/a422c14c94be5bbc7b7b9da56c3d155fd7e0d555/docker-compose.yaml#L7-L20
>
> Localhost (127.0.0.1) inside the container is not the localhost of your host system (where ollama is probably running).

Saved me! I think this `OLLAMA_HOST=http://host.docker.internal:11434` trick should be pinned somewhere.