andreascschmidt opened 1 month ago
What Ollama address are you using?
In the docker compose:

```yaml
ollama:
  volumes:
    - ./ollama:/root/.ollama
  ports:
    - 11434:11434
  container_name: ollama
  image: ollama/ollama
```

and in the config file:

```toml
OLLAMA = "http://host.docker.internal:11434/"
```

I also tried localhost. I've checked with netstat that the port is open.
Confirmed! I have the exact same error.
Ollama is running on another host on the network in my case, so I set the IP for that host like this:
```toml
OLLAMA = "http://172.16.17.28:11434" # Ollama API URL - http://host.docker.internal:11434
```
I receive an "Ollama is running" message if I open the configured URL and port in the browser.
Is there any (debug) log I can provide or generate? Thanks for any help.
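For Ollama on a separate host, the API has to be reachable beyond loopback. A minimal sketch of the remote host's side, assuming Ollama runs there via Docker (for a native install, the documented `OLLAMA_HOST` environment variable controls the bind address):

```yaml
# docker-compose.yml on the remote Ollama host (sketch, not your exact file):
# publish the API port so other machines on the network can reach it.
services:
  ollama:
    image: ollama/ollama
    environment:
      - OLLAMA_HOST=0.0.0.0   # bind on all interfaces (the image's default)
    ports:
      - 11434:11434           # expose the API to the LAN
```

Since the browser check already returns "Ollama is running", the port is likely fine from your machine; running the same check from inside the Perplexica backend container would confirm the container can reach it too.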
The cause of this issue for me was:

```yaml
args:
  - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
  - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```
I was deploying the stack on a remote server and using Tailscale to make the connections.
Building the front end with the following args solved the issue:

```yaml
args:
  - NEXT_PUBLIC_API_URL=http://host-name.ts-subdomain.ts.net:3001/api
  - NEXT_PUBLIC_WS_URL=ws://host-name.ts-subdomain.ts.net:3001
```
I don't have the logs anymore. Inspecting the traffic in the browser, I noticed that the UI was hanging because a call was being made to http://127.0.0.1:3001/api/models.
Hope this helps.
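For anyone else hitting this: those args live under the frontend service's `build` section in docker-compose.yml. A sketch, where the service name and build context are assumptions that need to match your compose file:

```yaml
services:
  perplexica-frontend:   # service name is an assumption
    build:
      context: .         # build context is an assumption
      args:
        - NEXT_PUBLIC_API_URL=http://host-name.ts-subdomain.ts.net:3001/api
        - NEXT_PUBLIC_WS_URL=ws://host-name.ts-subdomain.ts.net:3001
```

Since these are build args, a plain restart is not enough; the image has to be rebuilt (e.g. `docker compose up -d --build`) for the new URLs to be bundled into the frontend.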
> in the docker compose
>
> ```yaml
> ollama:
>   volumes:
>     - ./ollama:/root/.ollama
>   ports:
>     - 11434:11434
>   container_name: ollama
>   image: ollama/ollama
> ```
>
> and in the config file `OLLAMA = "http://host.docker.internal:11434/"` (I also tried localhost). I've checked with netstat that the port is open.
Since you're running Ollama alongside the other containers, you can just attach it to the Perplexica network and then access it at http://ollama:11434.
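A sketch of what that looks like, assuming the compose file defines a shared network (the network name here is an assumption):

```yaml
services:
  ollama:
    image: ollama/ollama
    networks:
      - perplexica-network   # name is an assumption; any network shared with the backend works
networks:
  perplexica-network:
```

config.toml then points at the service name instead of an IP: `OLLAMA = "http://ollama:11434"`. Docker's internal DNS resolves `ollama` to the container on that network, so no ports need to be published for container-to-container traffic.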
Try pulling llama3, I think, then restart; it should work.
I had the same issue using Ollama and SearXNG in two other Docker stacks. Then I decided to build the images locally (perplexica-backend:main and perplexica-frontend:main) instead of pulling from docker.io, and then it worked... maybe there is an issue with the image on Docker Hub?
There is no issue with the images on Docker Hub. They are hardcoded to the localhost reference: when we build the images, the NEXT_PUBLIC vars get bundled into the JavaScript, so there is no way to change them afterwards. I've mentioned this in the update guide as well: you need to build your own images if you wish to use an IP other than localhost. There's nothing else I can do.
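Building locally means swapping the `image:` pulls for `build:` sections, so the NEXT_PUBLIC_* args get baked in with your own values. A sketch; paths, dockerfile names, and the `your-host` placeholder are assumptions:

```yaml
services:
  perplexica-backend:
    build:
      context: .                       # assumption
      dockerfile: backend.dockerfile   # assumption
    image: perplexica-backend:main     # local tag instead of a Docker Hub pull
  perplexica-frontend:
    build:
      context: .                       # assumption
      dockerfile: app.dockerfile       # assumption
      args:
        - NEXT_PUBLIC_API_URL=http://your-host:3001/api   # your-host is a placeholder
        - NEXT_PUBLIC_WS_URL=ws://your-host:3001
    image: perplexica-frontend:main
```

`docker compose build` (or `docker compose up -d --build`) then produces images with the correct URLs bundled in.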
I encountered the same issue because I installed Skybox, which also uses port 3001. Perplexica's model API also listens on port 3001, so the two conflict.
@ItzCrazyKns I haven't been able to get this going, and there are a few more people opening new issues about what sounds like the same problem.
https://github.com/ItzCrazyKns/Perplexica/issues/437 https://github.com/ItzCrazyKns/Perplexica/issues/467
Would you mind adding a dummy foolproof step-by-step guide with included Ollama, or maybe even better, Open WebUI, assuming you have it running well?
Describe the bug
Under localhost:3000 I'm only getting a spinning wheel but no search.
To Reproduce
I used docker-compose.yml including Ollama, which I linked in the config.toml:

```toml
OLLAMA = "http://host.docker.internal:11434"
```