ItzCrazyKns / Perplexica

Perplexica is an AI-powered search engine and an open-source alternative to Perplexity AI.
MIT License
13.34k stars · 1.26k forks

project is broken on linux #281

Open mlario opened 1 month ago

mlario commented 1 month ago

Describe the bug
The last several updates made Perplexica unusable on Linux. After running a search I only get "Failed to connect to server. Please try again later".

It was working with an earlier version, though. The backend log is attached: backend_log.txt

To Reproduce
I run Perplexica in Docker on a vanilla local Fedora Linux install. Ollama runs as a systemd service. SearXNG runs in a separate Docker container on port 4000, and its config enables the json output format.

A month ago I had to make several modifications to get Perplexica working even with older versions, due to a communication error from Docker to Ollama. Here are the configs:

app.dockerfile.txt config.toml.txt docker-compose.yaml.txt

The most important change was using the host network.

Here are the docker run commands I use:

docker run --restart always --name perplexica-backend -e SEARXNG_API_URL=http://127.0.0.1:4000 --network=host -p 3001:3001 perplexica-perplexica-backend

docker run -d --restart always --name perplexica-frontend --network=host -p 3000:3000 perplexica-perplexica-frontend
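Note that with --network=host the -p mappings in these commands are ignored, because host-network containers share the host's network stack directly. For reference, the same setup can be sketched in compose form (a hedged sketch; image names are taken from the commands above, not from the project's shipped docker-compose.yaml):

```yaml
# Host-network sketch of the two `docker run` commands above.
# With network_mode: host, no `ports:` section is needed (or honored).
services:
  perplexica-backend:
    image: perplexica-perplexica-backend
    network_mode: host
    restart: always
    environment:
      - SEARXNG_API_URL=http://127.0.0.1:4000
  perplexica-frontend:
    image: perplexica-perplexica-frontend
    network_mode: host
    restart: always
```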

To make it clear: I again retried setting up Perplexica following the standard guidelines on GitHub, and that way I do not get communication to Ollama at all. With my own setup I can see the Ollama models in Perplexica's settings, but the error "Failed to connect to server. Please try again later" persists.

ItzCrazyKns commented 1 month ago


Why are you using the host network? Follow the installation steps exactly, and then follow this: https://github.com/ItzCrazyKns/Perplexica?tab=readme-ov-file#ollama-connection-errors
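For reference, the linked section addresses exactly the symptom described above: inside a (non-host-network) container, 127.0.0.1 refers to the container itself, so Ollama on the host is unreachable at that address. The usual Linux fix is to make Ollama listen on all interfaces (e.g. Environment="OLLAMA_HOST=0.0.0.0" in a systemd override) and point Perplexica at the host's LAN address. A hedged config.toml sketch (the key name is assumed from Perplexica's sample config; replace <private_ip> with your host's address):

```toml
# Assumed shape of Perplexica's config.toml endpoint section.
# <private_ip> is the host's LAN address as seen from the container.
[API_ENDPOINTS]
OLLAMA = "http://<private_ip>:11434"  # Ollama's default port
```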

mlario commented 1 month ago


Well, I tried following the standard installation process and then the guide for connection errors on Linux.

As I mentioned before, that failed and I could not get a connection to Ollama from Perplexica: no models were listed in the settings.

I use the host network because this solution works. It is a known issue that Docker containers cannot reach Ollama on Linux, and it was solved before with the host network, as described for Open WebUI: https://github.com/open-webui/open-webui?tab=readme-ov-file#open-webui-server-connection-error

Could you please give me a hint about my backend log file? I think troubleshooting it will let me solve the problem. I suspect there is an issue with communication to SearXNG: https://github.com/user-attachments/files/16354344/backend_log.txt

Thanks

ItzCrazyKns commented 1 month ago


If you use the host network then inter-container communication (over the Docker network) doesn't take place, so the address http://searxng:8080 doesn't exist on your actual host device.

ProjectMoon commented 1 month ago

If a container runs in host networking mode, it behaves like a service running directly on localhost: it has no access to Docker container networks and only sees what's running on the host. So to connect to SearXNG it has to use whatever port SearXNG is using on localhost, and if that port isn't exposed, it needs to be exposed.
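For reference, ProjectMoon's point can be checked from the host itself. A small sketch using bash's /dev/tcp pseudo-device (a bashism, not POSIX sh; the port numbers are the ones used in this thread and are assumptions about your setup):

```shell
#!/usr/bin/env bash
# Report whether anything is listening on a given 127.0.0.1 TCP port.
# Uses bash's built-in /dev/tcp; no external tools needed.
port_in_use() {
  # The subshell opens fd 3 to the port and closes it on exit;
  # the function's status is 0 iff the connection succeeded.
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

# Ports used in this thread: SearXNG on 4000, Perplexica backend on 3001.
for port in 4000 3001; do
  if port_in_use "$port"; then
    echo "port $port: something is listening"
  else
    echo "port $port: nothing is listening"
  fi
done
```

If SearXNG's port reports nothing listening, a host-network backend has no way to reach it at 127.0.0.1, and the port mapping (or the service itself) needs fixing first.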

hugokoopmans commented 1 month ago

The docker-compose.yaml also does not work OOTB.

I have nothing running on port 4000, but:

docker compose up -d
[+] Running 2/3
 ⠿ Container perplexica-searxng-1             Starting  0.2s
 ✔ Container perplexica-perplexica-backend-1  Created   0.0s
 ✔ Container perplexica-perplexica-frontend-1 Created   0.1s
Error response from daemon: driver failed programming external connectivity on endpoint perplexica-searxng-1 (acb06cb3e02dacad375d5ce175ad07fcfa8a8ca4f5dd6f762ceb52c0c52e4037): failed to bind port 0.0.0.0:4000/tcp: Error starting userland proxy: listen tcp4 0.0.0.0:4000: bind: address already in use
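The "address already in use" failure means another process already owns host port 4000. On a Linux host, iproute2's ss can show which one (lsof -i :4000 is an alternative; process names are only shown for root or for sockets you own):

```shell
# List TCP listeners bound to port 4000, with the owning process if visible.
ss -ltnp 'sport = :4000'
```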

ItzCrazyKns commented 1 month ago

Try sending a curl request to localhost:4000

hugokoopmans commented 1 month ago

Sorry, my bad. It turned out some other service was on that port; I moved the port and now it works OOTB.

mlario commented 1 month ago

http://searxng:8080

I corrected this to http://127.0.0.1:8080 and it worked, thanks!
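For reference, the reason this works: with --network=host there is no Docker bridge network, so Docker's internal DNS name searxng never resolves; the host's loopback address is what the backend can actually reach. A hedged config.toml sketch of the corrected endpoint (key name assumed from Perplexica's sample config):

```toml
# With --network=host the compose hostname `searxng` does not resolve;
# use the loopback address where SearXNG actually listens.
[API_ENDPOINTS]
SEARXNG = "http://127.0.0.1:8080"
```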