Closed liujvnes closed 5 months ago
What are the logs from the console and the backend container? Are you able to perform a search?
I can search, but I don't know in which file.
If you're able to search then refresh your page and wait till the loading finishes, then try opening the settings.
A search in Perplexica just stays stuck waiting, and I don't know what I need to change to make it work!
Can you show the logs from the backend container of docker?
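For reference, the backend container's logs can usually be pulled like this. This is a sketch only: the container name `perplexica-backend-1` is an assumption (Docker Compose derives it from the project and service names), so list the running containers first and substitute the name you actually see.

```shell
# Sketch: inspect Perplexica's backend logs, assuming Docker Compose
# named the container "perplexica-backend-1" (verify with `docker ps`).
if command -v docker >/dev/null 2>&1; then
  # List running container names to find the backend container
  docker ps --format '{{.Names}}'
  # Show the last 100 log lines from the assumed backend container
  docker logs --tail 100 perplexica-backend-1 ||
    echo "container not found; use the name shown by 'docker ps'"
else
  echo "docker is not installed on this machine"
fi
```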
I don't have Docker installed.
Then show the logs from the terminal you're running Perplexica in.
Can you try refreshing your page and sending something in English?
I can send messages in English, but the result just stays pending.
Can you open the Application tab in the browser's developer tools and show what values are there?
Have you pulled any models from Ollama? Show me the values in your localStorage.
Can you show me where I can set the localStorage value? I don't know where to set it!
Hi, unfortunately I cannot guide you on that. You can Google how to view localStorage values, and that will guide you better than I can. Have you pulled any model from Ollama?
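For anyone else stuck here, localStorage can be inspected from the browser itself: open the Perplexica page, press F12 to open DevTools, switch to the Console tab, and paste a snippet like the one below. This is a minimal sketch; the key names Perplexica actually uses (e.g. `chatModelProvider`) are assumptions and may differ between versions, so dump everything first and look at what's there.

```javascript
// Paste into the browser DevTools console on the Perplexica page.
// Collects every key/value pair currently stored in localStorage.
function dumpLocalStorage() {
  if (typeof localStorage === "undefined") {
    // Not running in a browser context (e.g. plain Node.js)
    return {};
  }
  const values = {};
  for (let i = 0; i < localStorage.length; i++) {
    const key = localStorage.key(i);
    values[key] = localStorage.getItem(key);
  }
  return values;
}

console.log(dumpLocalStorage());

// To set a value manually (the key name here is hypothetical —
// check the dump above for the real one your version uses):
// localStorage.setItem("chatModelProvider", "ollama");
```

After changing a value, refresh the page so the app re-reads it.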
There are models in Ollama
I may have the same issue: not able to select a chat model provider or embedding model provider.
Seems like a connection error when running it in Docker. Follow the steps listed here to get it fixed: https://github.com/ItzCrazyKns/Perplexica?tab=readme-ov-file#ollama-connection-errors
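The gist of that fix is that, from inside a Docker container, `localhost` refers to the container itself, not the machine running Ollama, so the Ollama URL in Perplexica's `config.toml` has to point at the host instead. A hedged sketch of the relevant fragment — the section and key names are assumptions and may differ between Perplexica versions:

```toml
# Sketch of the relevant part of config.toml (key names may differ by version).
[API_ENDPOINTS]
# "localhost" inside Docker points at the container itself, so use the
# host's address for Ollama instead:
OLLAMA = "http://host.docker.internal:11434"  # Windows / macOS
# OLLAMA = "http://<your_private_ip>:11434"   # Linux (leave <...> as your own IP)
```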
Hi @liujvnes, I released a new version of Perplexica that has a fix for it. You just need to re-clone Perplexica from GitHub and follow the installation instructions, and the issue will be resolved. If you face the error again, feel free to re-open the issue.
I installed Ollama locally but did not use Docker to install Perplexica. The local model is not displayed. I am a computer novice, please help me! What should I do?