braindotai closed this issue 6 days ago
Seems like Perplexica is not being able to connect to Ollama. What OS are you using?
I am using Arch Linux. http://localhost:11434 says "Ollama is running". I don't think the OS is the issue; I've installed other open-source tools, like the Continue extension in VS Code, and they work just fine with my local Ollama.
You need to follow the Ollama connection error guide in the README, just below the installation instructions.
I've tried that, no luck. Here's what I did:
$ ip addr show   # copied the inet value from wlan0
# in the Perplexica config: OLLAMA = "http://<my_machine_ip>:11434"
$ sudo ufw allow 11434/tcp
$ OLLAMA_HOST=0.0.0.0 ollama serve
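As an aside, the first step can be done non-interactively. This is just a sketch of the awk parsing, run against made-up sample `ip addr show` output (the interface details and address are illustrative, not from the reporter's machine):

```shell
# Extract the inet value for wlan0 without copying it by hand.
# Shown against captured sample output so the parsing is reproducible;
# in practice, replace the sample with: ip addr show wlan0
sample='3: wlan0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500
    inet 192.168.1.42/24 brd 192.168.1.255 scope global dynamic wlan0'
my_machine_ip=$(printf '%s\n' "$sample" | awk '/inet /{sub(/\/.*$/, "", $2); print $2; exit}')
echo "$my_machine_ip"   # 192.168.1.42
```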
Still not working; the backend logs show:
perplexica-backend-1 | error: Error loading Ollama models: TypeError: fetch failed
perplexica-backend-1 | error: Error loading Ollama embeddings: TypeError: fetch failed
perplexica-backend-1 | error: undefined
perplexica-backend-1 | error: Error loading Ollama models: TypeError: fetch failed
perplexica-backend-1 | error: Error loading Ollama embeddings: TypeError: fetch failed
perplexica-backend-1 | error: undefined
Try sending a curl request to the same URL.
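For example, a quick reachability probe might look like the sketch below. The URL here is a placeholder (swap in the address from your config), and `/api/tags` is Ollama's model-listing endpoint; an "unreachable" result corresponds to the backend's "TypeError: fetch failed" lines:

```shell
# Probe the Ollama endpoint; /api/tags lists installed models.
# "unreachable" here matches the backend's "fetch failed" errors.
url="http://localhost:11434/api/tags"
status="unreachable"
if curl -fsS --max-time 3 "$url" >/dev/null 2>&1; then
  status="reachable"
fi
echo "$status"
```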
All is good with the Ollama URL.
Oh wait, I needed to change the settings and select ollama manually. Got it working now!
Describe the bug: Not working on a fresh install; steps followed from README.md.
To Reproduce: I set the Ollama API URL to
http://host.docker.internal:11434
as mentioned in the README. It's my local Ollama; both Ollama and Perplexica are on the same laptop.

Expected behavior: After docker compose up everything should work; when I enter a query it should hit my local Ollama server, prepare the answer, and show the results.
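One thing worth checking on Linux: `host.docker.internal` is not resolvable inside containers by default (Docker Desktop on macOS/Windows provides it automatically). A sketch of the docker-compose addition that maps it to the host gateway, with the service name assumed from the log prefix above:

```yaml
services:
  perplexica-backend:   # service name assumed from the "perplexica-backend-1" log prefix
    extra_hosts:
      # On Linux, map host.docker.internal to the Docker host explicitly
      - "host.docker.internal:host-gateway"
```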
Additional context: Here are the full logs: perplexica-docker-compose-up.txt