ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine. It is an open-source alternative to Perplexity AI.
MIT License

Not working on Fresh install - tested on 2 different systems. #220

Closed: braindotai closed this issue 6 days ago

braindotai commented 1 week ago

Describe the bug: Not working on a fresh install; steps followed from README.md.

To Reproduce Steps to reproduce the behavior:

  1. git clone
  2. Changed the Ollama URL to http://host.docker.internal:11434 as mentioned in the README. It's my local Ollama; both Ollama and Perplexica are on the same laptop.
  3. docker compose up .... starts throwing errors.
  4. Entering a query on localhost:3000 does nothing.
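
For context, the README's suggested change goes in Perplexica's config.toml. A sketch of the relevant entry (the section name is my assumption; the OLLAMA key itself is quoted later in this thread):

```toml
# Sketch of the relevant config.toml entry. The [API_ENDPOINTS] section
# name is assumed; the OLLAMA key matches what is shown later in this thread.
[API_ENDPOINTS]
OLLAMA = "http://host.docker.internal:11434"  # point the backend at the host's Ollama
```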

Expected behavior: After docker compose up everything should work; when I enter a query it should hit my local Ollama server, prepare the results, and show them.

Additional context: Here are the full logs: perplexica-docker-compose-up.txt

ItzCrazyKns commented 1 week ago

Seems like Perplexica is unable to connect to Ollama. What OS are you using?

braindotai commented 1 week ago

> Seems like Perplexica is unable to connect to Ollama. What OS are you using?

I am using Arch Linux. http://localhost:11434 says Ollama is running. I don't think the OS is the issue; I've used other open-source tools, like the Continue extension in VS Code, and they work just fine with my local Ollama.

ItzCrazyKns commented 1 week ago

You need to follow the Ollama connection error guide in the README, just below the installation section.


braindotai commented 1 week ago

I've tried that. Here's what I did:

  1. $ ip addr show # copied the inet value from wlan0
  2. Changed the Ollama URL to OLLAMA = "http://<my_machine_ip>:11434"
  3. $ sudo ufw allow 11434/tcp
  4. Stopped Ollama and restarted it with $ OLLAMA_HOST=0.0.0.0 ollama serve
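
The steps above all assume the backend can actually reach Ollama over the network. A minimal, hypothetical sanity check (the function name is mine, not Perplexica's) that mirrors the backend's failing fetch:

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if something answers HTTP at base_url.

    A healthy Ollama server replies 200 on its root endpoint with the
    plain text "Ollama is running".
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# Run this from inside the backend container (e.g. via `docker exec`)
# using the same URL as in config.toml; a False here reproduces the
# "TypeError: fetch failed" seen in the logs below.
print(ollama_reachable("http://host.docker.internal:11434"))
```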

Still not working:

perplexica-backend-1   | error: Error loading Ollama models: TypeError: fetch failed
perplexica-backend-1   | error: Error loading Ollama embeddings: TypeError: fetch failed
perplexica-backend-1   | error: undefined
perplexica-backend-1   | error: Error loading Ollama models: TypeError: fetch failed
perplexica-backend-1   | error: Error loading Ollama embeddings: TypeError: fetch failed
perplexica-backend-1   | error: undefined

ItzCrazyKns commented 6 days ago

Try sending a curl request to the same URL.
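
Something along these lines (the URL here is an assumption; substitute whatever is in your config.toml). A healthy Ollama answers its root endpoint with the plain text "Ollama is running":

```shell
# Query Ollama's root endpoint; -s silences progress output, -f makes
# curl fail on HTTP errors so the fallback message fires on any problem.
# Run the same command from inside the container as well
# (docker exec perplexica-backend-1 ...) to rule out Docker networking.
curl -sf http://127.0.0.1:11434/ || echo "Ollama not reachable on 127.0.0.1:11434"
```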

braindotai commented 6 days ago

[Screenshot from 2024-06-25 15-47-53]

All is good with the Ollama URL.

braindotai commented 6 days ago

Oh wait, I needed to open the settings and select Ollama manually. Got it working now!