-
I want to change their ports to 3010 and 3011, respectively. How should I do it? I have changed the corresponding parts of the docker-compose.yaml and config.toml files, but the web page still …
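A minimal sketch of the kind of port remap this asks about, assuming service names and container-side ports like those in a typical Perplexica docker-compose.yaml (the exact names here are assumptions, not the project's canonical file):

```yaml
# Hypothetical docker-compose.yaml excerpt; service names and the
# container-side ports (3000/3001) are assumptions.
services:
  perplexica-frontend:
    ports:
      - "3010:3000"   # host port 3010 -> container port 3000 (only the left side changes)
  perplexica-backend:
    ports:
      - "3011:3001"   # host port 3011 -> container port 3001
```

After editing, the containers usually have to be recreated (for example `docker compose up -d --force-recreate`), and if the frontend resolves the backend address at build time it may also need a rebuild so it points at the new backend port, which would explain the page still behaving as before.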
-
![image](https://github.com/ItzCrazyKns/Perplexica/assets/25132014/e3e18989-3afb-4c66-8b4a-c2fe9af87669)
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
Install the latest PowerShell to learn about new features and improvements! htt…
-
**Describe the bug**
Running yarn build for the perplexica-frontend project fails with a type error in the EmptyChat.tsx component. The error message indicates that the size property passed to the Th…
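For illustration only (the actual component and expected type are in the truncated error), a prop-type mismatch of this kind generally looks like the hypothetical snippet below, where a `size` prop declared as a number rejects a string at build time:

```tsx
// Hypothetical illustration, not the real EmptyChat.tsx code.
import React from 'react';

type IconProps = { size?: number };

const Icon = ({ size = 24 }: IconProps) => (
  <svg width={size} height={size} viewBox="0 0 24 24" />
);

// <Icon size="17" />   // fails: string is not assignable to number
export const ok = <Icon size={17} />; // passing a number satisfies the declared type
```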
-
**Describe the bug**
Trying to install the product on Windows.
**Screenshots**
![image](https://github.com/ItzCrazyKns/Perplexica/assets/93899908/83160d6f-0426-45b4-af88-05fdc3972e65)
**Additiona…
-
**Is your feature request related to a problem? Please describe.**
Accessing Perplexica from other devices, at least within the LAN, should be possible.
**Describe the solution you'd like**
Solving the "Uncaught (in promis…
-
**Describe the bug**
Build fails in Windows WSL with default settings
**To Reproduce**
Steps to reproduce the behavior:
1. Clone the repo in WSL (Ubuntu Server 22.04)
2. Edit the config and only add…
-
@ItzCrazyKns none of these LLM API providers (Ollama LLMs, Azure OpenAI LLMs, embedding models, or any other LLMs) work at all despite those changes per your docs cited there.
_Originall…
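For reference, provider endpoints and keys are read from config.toml; a hedged excerpt is below. The section and key names follow my reading of the sample config, so treat them as assumptions and check them against the repo's sample.config.toml:

```toml
# Hypothetical config.toml excerpt; section/key names are assumptions.
[API_KEYS]
OPENAI = ""                      # required only for OpenAI-hosted models

[API_ENDPOINTS]
# When Perplexica itself runs in Docker, "localhost" resolves to the container,
# so an Ollama server on the host is usually reached via host.docker.internal:
OLLAMA = "http://host.docker.internal:11434"
```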
-
As the title says, I hope to be able to run this service on my Ubuntu server and access it using a domain name, but my attempts have failed. I have carefully read through Networking.md, but it has not been eff…
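One common way to put the service behind a domain is a reverse proxy in front of the two containers; a hedged nginx sketch follows (the domain, ports, and path layout are assumptions, and the upgrade headers are included in case the chat runs over a WebSocket):

```nginx
# Hypothetical nginx reverse-proxy sketch; domain, ports, and paths are assumptions.
server {
    listen 80;
    server_name perplexica.example.com;

    # Frontend UI
    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
    }

    # Backend API + WebSocket (upgrade headers are needed for the socket)
    location /api/ {
        proxy_pass http://127.0.0.1:3001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```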
-
**Describe the bug**
The interface is slow to load, and when it does load I receive an `Invalid connection` message; the settings page (also slow to load) doesn't detect the installed Ollama models
**To Reproduce**
Steps…
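When the backend runs in Docker and Ollama runs on the host, `Invalid connection` and an empty model list are often just the container failing to reach Ollama at `localhost`. A hedged sketch of the usual workaround (the service name is an assumption):

```yaml
# Hypothetical docker-compose.yaml addition so the backend container can reach
# services on the host via host.docker.internal (needed on Linux; on Docker
# Desktop the hostname already exists).
services:
  perplexica-backend:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

The Ollama endpoint in config.toml would then point at `http://host.docker.internal:11434` instead of `http://localhost:11434`.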
-
**Is your feature request related to a problem? Please describe.**
No problem, just quality of life. When a response has been generated, it would be really nice to be able to hit "/" to select the tex…
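A hedged sketch of how such a shortcut could look in the frontend (the hook and ref names are hypothetical, not existing Perplexica code):

```tsx
// Hypothetical React hook: pressing "/" selects the contents of a target element,
// e.g. the latest response. Not part of the existing codebase.
import { useEffect, type RefObject } from 'react';

export function useSlashSelect(targetRef: RefObject<HTMLElement>) {
  useEffect(() => {
    const onKeyDown = (e: KeyboardEvent) => {
      // Skip the shortcut while the user is typing in a form field.
      const tag = (e.target as HTMLElement | null)?.tagName;
      if (tag === 'INPUT' || tag === 'TEXTAREA') return;
      if (e.key !== '/' || !targetRef.current) return;
      e.preventDefault();
      const range = document.createRange();
      range.selectNodeContents(targetRef.current);
      const selection = window.getSelection();
      selection?.removeAllRanges();
      selection?.addRange(range);
    };
    window.addEventListener('keydown', onKeyDown);
    return () => window.removeEventListener('keydown', onKeyDown);
  }, [targetRef]);
}
```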