ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine. It is an open-source alternative to Perplexity AI.
MIT License

"Failed to connect to the server. Please try again later." #419

Open · coderyiyang opened 4 days ago

coderyiyang commented 4 days ago

Describe the bug
When I followed Perplexica's instructions to install it with Docker and set the Ollama API URL to http://host.docker.internal:11434 on the settings page, then saved and returned to the main page, it showed "Failed to connect to the server. Please try again later.", as shown in the screenshot below.

I have also tried changing http://host.docker.internal:11434 to http://192.168.x.x:11434, but the problem persists. I have tried other similar fixes from the issues page as well, but the problem still exists.
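
For what it's worth, one way to narrow this down would be to check whether Ollama is reachable from inside the backend container at all. A minimal check, assuming the backend container is named perplexica-backend (the real name will show in docker ps) and that the image ships curl:

```bash
# From the Windows host: does Ollama answer locally?
curl http://localhost:11434/api/tags

# From inside the Perplexica backend container: can it reach the host?
# (container name is a guess; curl may not be present in the image)
docker exec -it perplexica-backend curl http://host.docker.internal:11434/api/tags
```

If the second command fails, a common suggestion in similar threads is that Ollama on Windows binds only to 127.0.0.1 by default, and setting the OLLAMA_HOST=0.0.0.0 environment variable before starting Ollama makes it reachable from containers.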

This problem did not exist in the previous version I used about a month ago.

Environment:
- OS: Windows 10, version 22H2
- Docker: version 27.2.0, build 3ab4256
- Perplexica: v1.9.1
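
For reference, the Ollama endpoint also lives in config.toml at the project root. The excerpt below follows the shape of the repo's sample.config.toml; I'm assuming the field names are unchanged in v1.9.1:

```toml
# config.toml (excerpt) — field names taken from sample.config.toml
[API_ENDPOINTS]
OLLAMA = "http://host.docker.internal:11434"
```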

Expected behavior
Debug and explain why this happened; a step-by-step explanation would be appreciated.

Screenshots
[two screenshots attached]


ItzCrazyKns commented 4 days ago

Is Ollama accessible at http://localhost:11434?
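
A quick way to check, from a terminal on the Windows host (the root endpoint just returns a liveness string):

```bash
curl http://localhost:11434
# expected response: "Ollama is running"
```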

coderyiyang commented 4 days ago

Yes, Ollama works just fine, and I tried http://localhost:11434 too, but it failed.


ItzCrazyKns commented 3 days ago

Use the http://host.docker.internal:11434 URL and select Ollama in the chat model provider dropdown. If you don't see it there, I need the logs from the backend.
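
If the backend writes its logs to stdout (typical for containerized apps), something like this should capture them; the service name is a guess, so check your docker-compose file or docker ps:

```bash
docker compose logs perplexica-backend
# or, for a specific container:
docker logs <container-id>
```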

coderyiyang commented 3 days ago

> Use the http://host.docker.internal:11434 URL and select Ollama in the chat model provider dropdown. If you don't see it there, I need the logs from the backend.

When I use http://host.docker.internal:11434 as the URL, there is no option other than 'Custom_openai' in the chat model provider list; see the screenshot: [screenshot attached]

I'm not sure whether 'the logs from the backend' means the logs generated by Docker; if so, here are the files I retrieved: electron-2024-10-21-20.log, monitor.log, com.docker.backend.exe.log

If you mean similar log files generated by Perplexica itself, please tell me where they are. I browsed its directory structure but found no .log files. Thanks a lot.

goughjo02 commented 3 days ago

Did you press the blue button after you updated the config?

coderyiyang commented 3 days ago

> Did you press the blue button after you updated the config?

Yeah, sure, otherwise there would have been no log files to generate.

goughjo02 commented 3 days ago

I had the same bug as you, but I got it to stop. Could you try disabling the cache in the Network tab of your browser's dev tools? It's a shot in the dark, but see what happens.

coderyiyang commented 3 days ago

> I had the same bug as you, but I got it to stop. Could you try disabling the cache in the Network tab of your browser's dev tools? It's a shot in the dark, but see what happens.

Thanks for the tip, I'll give it a shot later!

coderyiyang commented 2 days ago

> I had the same bug as you, but I got it to stop. Could you try disabling the cache in the Network tab of your browser's dev tools? It's a shot in the dark, but see what happens.

Nope! It doesn't work for me, but thank you all the same.