papiche opened this issue 2 months ago
What Ollama address are you using? And what operating system are you on
OK, my fault... In config.toml I had:

```toml
[API_ENDPOINTS]
OLLAMA = "http://host.docker.internal:11434"
```

As Ollama runs under systemd on the host, setting

```toml
OLLAMA = "http://127.0.0.1:11434"
```

should fix it.
In fact, no: `docker logs perplexica-perplexica-backend-1` still complains:
```
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: undefined
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: undefined
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: undefined
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
```
`ollama --version`: ollama version is 0.3.9
on Linux Mint 22; `uname -a`: Linux sagittarius 6.8.0-41-generic #41-Ubuntu SMP PREEMPT_DYNAMIC Fri Aug 2 20:41:06 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
OK.
It happened when I updated Ollama: the update reset the systemd unit and removed the environment parameter Environment="OLLAMA_HOST=0.0.0.0".
I thought it was that, but still no luck ;(
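For reference, a minimal sketch of restoring that variable in a way that survives future updates, assuming the unit is named ollama.service (the name the standard Ollama installer uses):

```shell
# Sketch: set OLLAMA_HOST via a systemd drop-in instead of editing the unit
# file directly, so the next Ollama update does not wipe the setting again.
sudo mkdir -p /etc/systemd/system/ollama.service.d
cat <<'EOF' | sudo tee /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```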
```
info: Server is running on port 3001
error: undefined
```
Is there a way to get more logs from the server? Still investigating...
Just change the LLM and embedding provider in the settings to Ollama; that will fix your issue. If not, just reopen it.
This is my settings.
What surprises me is that I only have "open_ai" to choose from as the chat model provider.
I suppose you are running Perplexica in Docker. For a Docker container, localhost refers to the container's own network, not the host's, so you need to use the computer's private IP + port as the Ollama API URL.
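A quick way to verify this, sketched under the assumptions that the backend container is named perplexica-perplexica-backend-1 (as in the logs above), that curl is present in the image, and that 192.168.1.33 is the host's LAN IP:

```shell
# From inside the container, localhost points at the container itself,
# so this should fail even though Ollama is running on the host:
docker exec perplexica-perplexica-backend-1 curl -s http://127.0.0.1:11434

# The host's private LAN IP reaches Ollama on the host (replace with yours).
# When reachable, Ollama answers "Ollama is running" on its root endpoint:
docker exec perplexica-perplexica-backend-1 curl -s http://192.168.1.33:11434
```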
I am using Perplexica on a GPU-equipped computer on my LAN. I used

```yaml
perplexica-frontend:
  build:
    context: .
    dockerfile: app.dockerfile
    args:
      - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
      - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```

as docker-compose parameters,
then I use it on my host through "ssh tunnels":

```shell
echo "Perplexica"
ssh -fN -L 5000:127.0.0.1:3000 frd@gpu.local
echo "Perplexica API"
ssh -fN -L 3001:127.0.0.1:3001 frd@gpu.local
```
Maybe that could lead to websocket communication errors?
Should I try to publish the "LAN IP" in the Docker build?
I succeeded in connecting to Ollama with these parameters (OpenAI custom parameters),
but I get no answers in the frontend.
Still the same backend errors:
```
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
```
@ItzCrazyKns, with the latest version of Perplexica I am getting the same error with OpenAI models, so it has nothing to do with Ollama or Docker networking.
> I succeed connecting to ollama with this parameters (OpenAI custom parameters)
> But i get no answers in frontend
> Still the same backend errors [...]
Because it doesn't actually connect: you need to use your private IP + port. It doesn't matter what provider you use; localhost for the container refers to its own networking, not yours.
As I am forwarding ports 3000 & 3001 to my machine, either 127.0.0.1 or localhost should work, but it could break some "ws" connections, as the browser log indicates.
As Perplexica is running on IP 192.168.1.33 in my LAN, I should try to change docker-compose like this:

```yaml
perplexica-frontend:
  build:
    context: .
    dockerfile: app.dockerfile
    args:
      - NEXT_PUBLIC_API_URL=http://192.168.1.33:3001/api
      - NEXT_PUBLIC_WS_URL=ws://192.168.1.33:3001
```

and access http://192.168.1.33:3000.
NB: as there is no login, "ssh reverse tunnels" were used to access Perplexica from a remote location over the Internet.
Yes, but only access it via the private IP. If you tunnel and try to access it from outside the network, it won't work at all. Also, you need to use private IP + port for Ollama and serve Ollama over the network.
Using the LAN address (which is 192.168.1.27, ports 5000/3001) in docker-compose.yaml and accessing it without the ssh tunnel:
IT WORKS
Make sure Ollama is accessible via your LAN.
Is there any way to access Perplexica from the WAN?
There are many ways but we don't provide support for that.
OK. At least it needs a "websocket" relay for port 3001. Do you plan to add user access control?
There already exists a branch called admin-password which adds a password for confidential things like the settings. Consider closing this issue if your problem has been resolved.
I'll try this other branch, thx. And so many thanks to you and the wonderful FOSS you made! You help AI become a "common good".
It also happens to me regularly: I just wait in front of the prompt and after a while "Failed to connect to server" appears.
In the console, I can see a 404 when accessing https://perplexica.at.home/discover?_rsc=acgkz
In the Network tab, the last request is a "protocol switch" (101):
```http
GET /?chatModel=llama3.1%3Alatest&chatModelProvider=ollama&embeddingModel=llama3.1%3Alatest&embeddingModelProvider=ollama HTTP/1.1
Host: perplexicapi.at.home
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:130.0) Gecko/20100101 Firefox/130.0
Accept: */*
Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate, br, zstd
Sec-WebSocket-Version: 13
Origin: https://perplexica.at.home
Sec-WebSocket-Extensions: permessage-deflate
Sec-WebSocket-Key: l7/fyMHT/H8/0UOi4P2++w==
DNT: 1
Connection: keep-alive, Upgrade
Cookie: rl_session=RudderEncrypt%3AU2FsdGVkX18t7%2FREcOWO4PvwpX5dATkMGtZgxbaRUhywwx6hilB5AUcKpjschuZcjQjiDtyhx9FHiAjGLlrmnREuStYEwFd67XBQoV%2Fj%2BfHUKHFB4S4bvjD5waYBdyNliNv4CdD1KKOOGDnV4h0Wrg%3D%3D; rl_anonymous_id=RudderEncrypt%3AU2FsdGVkX1%2Bo7zdetmFDmGks%2B8QHRPrUL1K8pjsznAVrmNLtL6APBFOx7dtgDZU%2B4WUXlTHFg3tB5koWdWntLg%3D%3D; rl_page_init_referrer=RudderEncrypt%3AU2FsdGVkX19AlkgDwJi6wDXlFUzSPwh8NgQ8xgxPmCLyxM84t40a%2FuKqukjcPyK6; rl_page_init_referring_domain=RudderEncrypt%3AU2FsdGVkX1%2FFeJkEUcSmC48SDEszq%2FmELgtN5pT92rigRloFmG%2BEvRbXCSmx3Rz9; ph_phc_4URIAm1uYfJO7j8kWSe0J8lc8IqnstRLS7Jx8NcakHo_posthog=%7B%22distinct_id%22%3A%22f85477d4364bf27e078eae4790bed442db07a0421c6557dd2883173d195810a0%23af4ec013-fe45-4cad-8f20-d5ed0525a7ed%22%2C%22%24sesid%22%3A%5B1726536524590%2C%220191fd73-55df-7044-9555-1fd813bc64bf%22%2C1726534079967%5D%2C%22%24epp%22%3Atrue%7D
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: websocket
Sec-Fetch-Site: same-site
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
```
In case it helps investigating the problem
> It also happens to me regularly. Just wait in front of the prompt and after a while "Failed to connect to server" appears [...]
Just refresh the page and see if that works. I generally don't provide support for LAN-hosted or network-hosted versions because most of the time it's just your network.
You are right, it could be network quality, and that can happen on a LAN ("wifi with roommates" conditions). So maybe having some retries instead of a direct timeout could help. From what part of the code is this error raised? I could have a look.
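As an illustration of the retry idea (not Perplexica's actual code), a minimal shell sketch where try_fetch is a hypothetical stand-in for the flaky fetch, failing twice before succeeding:

```shell
# Retry with a short delay instead of giving up on the first "fetch failed".
attempts_seen=0
try_fetch() {
  attempts_seen=$((attempts_seen + 1))
  [ "$attempts_seen" -ge 3 ]   # stand-in: succeeds on the third attempt
}

result=""
for i in 1 2 3 4 5; do
  if try_fetch; then
    result="connected after $i attempts"
    break
  fi
  sleep 0.1   # a real client would back off exponentially here
done
echo "$result"
```

A real implementation would cap the total wait and surface the last error instead of "error: undefined".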
fwiw, I ran into a similar error and what fixed it for me was changing the base image of node that runs from within backend.dockerfile. Essentially, changing it to `node` from `node:slim` resolved my issue.
> fwiw, I ran into a similar error and what fixed it for me was changing the base image of node that runs from within backend.dockerfile. Essentially changing it to `node` from `node:slim` resolved my issue.
Same error here; could you be more specific, please?
I read the whole thread and tried to fix it.
I got it working after setting everything to the LAN device IP with the Ollama port, plus net.ipv4.ip_forward=1 in sysctl and in docker-compose.yaml, three times.
I had forgotten I had a firewall active, so I made a rule to allow Ollama to listen on the network. Now it's working.
If I got something wrong you can tell me, but I am glad it is working now because I spent the whole day on it.
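For reference, a sketch of such a firewall rule, assuming ufw (the firewall frontend shipped with Mint/Ubuntu), the default Ollama port 11434, and a 192.168.1.0/24 LAN:

```shell
# Allow LAN machines (and Docker containers routed through the host)
# to reach Ollama on its default port.
sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp
sudo ufw reload
sudo ufw status | grep 11434   # verify the rule is active
```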
> fwiw, I ran into a similar error and what fixed it for me was changing the base image of node that runs from within backend.dockerfile. Essentially changing it to `node` from `node:slim` resolved my issue.
Do you mean changing "backend.dockerfile" into

```dockerfile
FROM node:18
WORKDIR /home/perplexica
COPY src /home/perplexica/src
COPY tsconfig.json /home/perplexica/
COPY drizzle.config.ts /home/perplexica/
COPY package.json /home/perplexica/
COPY yarn.lock /home/perplexica/
RUN mkdir /home/perplexica/data
RUN yarn install --frozen-lockfile --network-timeout 600000
RUN yarn build
CMD ["yarn", "start"]
```

Could you confirm it solves "Failed to connect to server"?
In the latest code version, when this issue fires, my browser console shows:

```
[DEBUG] closed
```

while the network fails loading
https://perplexica.mydomain.tld/?_rsc=acgkz
https://perplexica.mydomain.tld/discover?_rsc=acgkz
It also happens during "video" playback, with the same [DEBUG] closed.
Describe the bug
After the latest code update I cannot use Perplexica.
Opening the client displays "Failed to connect to the server. Please try again later."
Additional context