ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine and an open-source alternative to Perplexity AI.
MIT License

Doesn't work with local-hosted Ollama on Linux #173

Closed 5a9awneh closed 1 week ago

5a9awneh commented 3 weeks ago

Describe the bug The interface is slow to load, and when it finally does, I receive an "Invalid connection" message. The Settings page (also slow to load) doesn't detect the installed Ollama models.

To Reproduce Steps to reproduce the behavior:

  1. Install Ollama locally (not as docker)
  2. Tried different Ollama API URLs in config.toml:
     - http://host.docker.internal:11434
     - http://localhost:11434
     - http://private_ip_of_computer_hosting_ollama:11434
  3. Install Perplexica via docker compose up -d
  4. Browse to http://localhost:3000 and see the error
  5. Open Settings: it detects neither Ollama nor any Ollama models
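For reference, the Ollama endpoint lives in Perplexica's config.toml; a minimal sketch of the relevant section (section and key names assumed from the repo's sample config, IP address is an example):

```toml
# config.toml (sketch; names taken from Perplexica's sample config)
[API_ENDPOINTS]
# From inside the Perplexica container, "localhost" refers to the
# container itself, not the host running Ollama, so the host's LAN IP
# (or host.docker.internal, where the Docker runtime supports it)
# must be used instead:
OLLAMA = "http://192.168.1.10:11434"  # example private IP of the Ollama host
```

This is why the `http://localhost:11434` attempt in step 2 cannot work when Perplexica runs in Docker but Ollama runs natively on the host.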

Expected behavior Perplexica should connect to Ollama and detect the installed models.


ItzCrazyKns commented 1 week ago

You need to expose Ollama to your network and then use the private IP of the computer plus the port as the API URL. Feel free to re-open this issue or join our Discord server if you face any issues after performing these steps.
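On Linux, the Ollama service installed by the official script binds to 127.0.0.1 by default, so containers on the same machine cannot reach it via the host's private IP. One way to expose it is a systemd drop-in setting Ollama's documented `OLLAMA_HOST` variable (a sketch; the service name `ollama.service` is the one created by the official installer):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# Create this drop-in with:  sudo systemctl edit ollama.service
# Apply it with:             sudo systemctl restart ollama
[Service]
# Listen on all interfaces instead of only 127.0.0.1
Environment="OLLAMA_HOST=0.0.0.0"
```

After restarting, `curl http://<private_ip>:11434/api/tags` from another machine (or from inside the Perplexica container) should return the installed models; that same `http://<private_ip>:11434` URL is what goes into config.toml.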