-
I'm looking for a simple way to change Perplexica's default language, meaning I'd like to receive direct responses in French without having to rephrase the question.
The simplest solution would be t…
-
The parent repository for **[sheeeng/itzcrazykns-perplexica](https://github.com/sheeeng/itzcrazykns-perplexica)** has updates available.
### Important!
Click on this [compare link](https://github.c…
-
**Describe the bug**
I used the Docker method to build Perplexica; however, every time the build fails at backend build step 11, the `yarn install` part, as shown below:
![image](https://github.com…
-
There are plenty of amazing solutions for using large language models (LLMs) to help with searching. For the sake of brevity, I'll point out four kinds of them that I want in a modern sea…
-
**Describe the bug**
Installed the latest 1.7, then ran `docker compose up -d`:
Trying to pull docker.io/library/perplexica_perplexica-backend:latest...
Error: initializing source docker://perplexica_perplexica-bac…
-
**Describe the bug**
Deploying on RepoCloud asks for an OPEN_API_KEY, which I provided, but I am not on a billing plan.
I want to use Groq instead, but:
I set GROQ_API_KEY in the settings
I set GROQ_API_KEY in the environment variables
…
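For an issue like this, the relevant behavior is how the app decides which key to use when both a settings value and an environment variable are present. A minimal sketch of a settings-first, environment-fallback lookup is below; the names (`resolveApiKey`, the `Settings` shape) are hypothetical and not Perplexica's actual API.

```typescript
// Hypothetical sketch: resolve a provider API key, preferring an explicit
// settings value and falling back to the process environment.
// These names are illustrative only, not Perplexica's real implementation.
interface Settings {
  groqApiKey?: string;
}

function resolveApiKey(
  settings: Settings,
  env: Record<string, string | undefined>
): string | undefined {
  // A key set in the settings wins; otherwise fall back to GROQ_API_KEY.
  return settings.groqApiKey ?? env["GROQ_API_KEY"];
}
```

If neither source is set, the function returns `undefined`, which is the situation the reporter seems to be hitting despite configuring the key in both places.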
-
**Describe the bug**
I am unable to access Perplexica behind an nginx reverse proxy. The frontend loads, but the backend does not (infinite loading). The browser reports the following errors:
```
Blocked lo…
-
**Describe the bug**
When I use the VLM-GGUF Loader node with the Local Large Language Model node, and the VLM-GGUF model type is selected, I get this error: "Prompt exceeds n_ctx: 2913 > 1024". This erro…
-
Regardless of the input language, Perplexica replies in English most of the time.
Could you please introduce a "preferred output language" option in the settings? With prompting like "Please write your…
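One way such a setting could be wired in is by appending a language instruction to the system prompt before each request. The sketch below is only illustrative; the function name and setting are assumptions, not Perplexica's actual code.

```typescript
// Hypothetical sketch: append a preferred-output-language instruction to a
// system prompt. Not Perplexica's actual implementation.
function withPreferredLanguage(systemPrompt: string, language?: string): string {
  // No preference configured: leave the prompt unchanged.
  if (!language) return systemPrompt;
  // Otherwise, add an explicit instruction for the model to follow.
  return `${systemPrompt}\nPlease write your answer in ${language}.`;
}
```

A settings-driven instruction like this would make the output language independent of the language the question happens to be asked in.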
-
**Describe the bug**
The frontend shows a "No chat models available" error. Upon inspection in the browser, I find that the GET request to `http://127.0.0.1:3001/api/model` returns a 403 error (see screenshots)
…