ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine and an open-source alternative to Perplexity AI.
MIT License
16.27k stars 1.52k forks

Frontend cannot access model list from the backend #283

Closed nguyentran0212 closed 3 months ago

nguyentran0212 commented 4 months ago

Describe the bug The frontend shows a "No chat models available" error. Upon inspection in the browser, I found that the GET request to http://127.0.0.1:3001/api/model returns a 403 error (see screenshots).

To Reproduce Steps to reproduce the behavior:

  1. Install Perplexica in Docker according to the instructions in the README.
  2. Open localhost:3000

Expected behavior Perplexica is usable.

Screenshots

403 error from front-end:

Screenshot 2024-07-24 at 5 18 08 pm

Response from api/model at 127.0.0.1:3001 (Ollama model was found correctly)

Screenshot 2024-07-24 at 5 20 57 pm

Additional context

Running on macOS.

ItzCrazyKns commented 4 months ago

What do you see at http://127.0.0.1:3001 ?

nguyentran0212 commented 4 months ago

@ItzCrazyKns At 127.0.0.1:3001, I see `Cannot GET /`

ItzCrazyKns commented 4 months ago

> @ItzCrazyKns At 127.0.0.1:3001, I see `Cannot GET /`

Check http://127.0.0.1:3001/api/models
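For anyone debugging the same symptom, a quick way to see what the backend actually returns is to hit that endpoint from the command line (a diagnostic sketch; the port assumes the default docker-compose mapping):

```shell
# Print only the HTTP status code from the backend's model-list endpoint.
# 127.0.0.1:3001 assumes the default docker-compose port mapping.
# Expect 200 if the backend works, 403 as in this report, or 000 if unreachable.
curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:3001/api/models
```

A 403 here (outside the browser) would rule out CORS and point at the backend itself.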

ItzCrazyKns commented 4 months ago

> @ItzCrazyKns I also encountered the same problem (image) and backend logs. Is it a problem with my proxy or the OpenAI key?
>
> image

Are you using a custom OpenAI provider or the normal OpenAI provider? The error is unrelated to Perplexica; it seems to be an issue with your configuration.
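For reference, the provider settings live in Perplexica's config.toml. A sketch of the relevant sections (key names are based on the project's sample.config.toml, so treat them as assumptions and check your own copy):

```toml
[API_KEYS]
OPENAI = "sk-..."   # leave empty if you only use Ollama

[API_ENDPOINTS]
# The backend container must be able to reach this address;
# from inside Docker on Mac/Windows that is typically host.docker.internal.
OLLAMA = "http://host.docker.internal:11434"
```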

nguyentran0212 commented 4 months ago

> @ItzCrazyKns At 127.0.0.1:3001, I see `Cannot GET /`
>
> http://127.0.0.1:3001/api/models

Screenshot 2024-07-24 at 5 18 15 pm

I use Ollama. The backend gets the correct model list from my Ollama, but the front-end apparently cannot communicate with the backend via 127.0.0.1.

I tested with both Safari and Chrome.

My device is a MacBook Pro M1 running macOS 14.3.1.

Firewall is disabled on my MacBook.

ItzCrazyKns commented 4 months ago

> @ItzCrazyKns At 127.0.0.1:3001, I see `Cannot GET /`
>
> http://127.0.0.1:3001/api/models
>
> Screenshot 2024-07-24 at 5 18 15 pm
>
> I use Ollama. The backend gets the correct model list from my Ollama, but the front-end apparently cannot communicate with the backend via 127.0.0.1.
>
> I tested with both Safari and Chrome.
>
> My device is a MacBook Pro M1 running macOS 14.3.1.
>
> Firewall is disabled on my MacBook.

That's strange. Can you try reinstalling Perplexica, or provide me with the exact steps so I can try to reproduce it?

ItzCrazyKns commented 4 months ago

> image
>
> Boss, may I ask a question? Due to network restrictions on my server, I am unable to access SearxNG, so I would like to separate the front-end and back-end. As I understand it, there are four parts: the front-end, the back-end, SearxNG, and the model service, and as long as SearxNG has VPN access it can work normally. I now have two environments: my Mac can search with Perplexica normally, but my Linux machine does not work. The above is my configuration. My Ollama shows that a call was made, but see the backend logs @ItzCrazyKns image

Can you try running a curl request to the same address that you're using for Ollama, on the same machine as Perplexica?

curl <address>

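As a concrete sketch (the address below is Ollama's default port and is an assumption; substitute whatever address is in your config):

```shell
# /api/tags is Ollama's endpoint for listing pulled models. If this fails
# from the Perplexica host, the backend cannot reach Ollama either.
# The echo fallback just makes the failure case visible.
curl -s http://localhost:11434/api/tags || echo '{"models":[]}'
```

If this returns your model list but Perplexica's backend still logs errors, the problem is likely the address the backend container uses, not Ollama itself.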
nguyentran0212 commented 3 months ago

@ItzCrazyKns Hi, sorry for the late response. I pulled the latest version from GitHub 5 minutes ago and ran `docker compose up -d` again. This time it works. Four new commits were pulled, so I guess one of them did the trick.