-
### Describe the bug
After I reinstalled the webui, it core dumps instantly every time I send the LLM a message. This wasn't the case before. Oddly enough, when I used SillyTavern to send a mo…
-
I tried to install a custom pipeline with additional dependencies, as described in the README:
docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -e PIPELINES_URLS="https://gi…
-
I am using Ollama with Open WebUI and would love to use them here. Both provide OpenAI-compatible endpoints, so they should work.
I tried entering the Ollama URL HTTP://192.168.2.162:11434 or HTTP://192.168…
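For reference, Ollama serves its OpenAI-compatible API under the `/v1` path, and the scheme is conventionally lowercase; a minimal sketch (the helper name is illustrative, and whether your integration expects the `/v1` suffix may vary) of the base URL an OpenAI-style client would use:

```python
# Sketch: building the OpenAI-compatible base URL for an Ollama server.
# Ollama's default port is 11434; its OpenAI-compatible endpoints live
# under /v1 (e.g. /v1/chat/completions).

def openai_base_url(host: str, port: int = 11434) -> str:
    """Return the base URL an OpenAI-style client should point at."""
    return f"http://{host}:{port}/v1"

print(openai_base_url("192.168.2.162"))  # http://192.168.2.162:11434/v1
```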
-
# Bug Report
## Description
**Bug Summary:**
Pressing the stop button next to the input field doesn't actually stop Ollama from generating.
According to this issue, it should be sufficient t…
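On the client side, stopping generation usually means ceasing to consume the token stream (and, in a real client, also closing the underlying HTTP response so the server can abort). A minimal Python sketch of a cancellable stream consumer; the names here are illustrative, not Open WebUI's actual code:

```python
import threading

def consume_stream(tokens, stop_event: threading.Event):
    """Collect tokens from a streamed response until a stop flag is set.

    `tokens` stands in for an iterator over a streamed LLM response; a
    real client would additionally close the HTTP connection on stop.
    """
    received = []
    for tok in tokens:
        if stop_event.is_set():
            break  # the "stop" button was pressed
        received.append(tok)
    return received

stop = threading.Event()

def fake_stream():
    # Simulate the stop button being pressed after three tokens arrive.
    for i, tok in enumerate(["Hel", "lo", ",", " wor", "ld"]):
        if i == 3:
            stop.set()
        yield tok

print(consume_stream(fake_stream(), stop))  # ['Hel', 'lo', ',']
```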
-
Hello, is it possible for this plugin to support the API endpoint from [open-webui](https://github.com/open-webui/open-webui)?
I've been using Obsidian for a while and have kept using OpenAI's API, but now I…
-
In the screenshot below, I've searched for a paper that was released a few days ago.
The answer says "I cannot access that link."
This pretty much defeats the purpose of using Perplexica.
![image](htt…
-
The currently selected model is glm4-chat; an error is raised during conversation. The detailed error is:
```
INFO: 127.0.0.1:62988 - "GET /tools HTTP/1.1" 200 OK
2024-07-16 16:39:32,138 httpx 21692 INFO HTTP Request: GET http://127.0.0.1:7861/tools "H…
-
Something that is missing in `aichat` is the ability to:
- Edit the last answer from LLM
- Continue generation
I believe these are essential features to unlock the capabilities of LLMs, all fam…
-
# Prerequisites
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the [README.md](https://github.com/ggerganov/lla…