-
### Describe the bug
When running the file "start_windows.bat", it says that the "OK" command is unrecognized. When running the .bat again:
********************************************* * **********…
-
### Describe the bug
Adding or removing any extension or flag works fine. The problem arises when I enable the "Listen" flag and click on "Apply flags/extensions and restart", which yields a red "e…
-
Could we have support for [Llama.cpp](https://github.com/ggerganov/llama.cpp)?
That would make the model more accessible to many popular tools like Ollama, LM Studio, Koboldcpp, text-generation-webui,…
-
Great interface!
Are there any plans to support text-generation-webui as a backend?
https://github.com/oobabooga/text-generation-webui
-
### Describe the bug
Hi all, I'm still running into the problem from thread https://github.com/oobabooga/text-generation-webui/issues/5123 - OSError: [WinError 126] The specified module could not be fou…
-
### Describe the bug
Failed to build the chat prompt.
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Reproduction
Failed to build the chat prompt.
### Scr…
-
Can we forego loading models until we have a menu?
I don't want to have 50 GB of models when I only need one: Bunny, Phi, or DeepSeek-VL.
PaliGemma seems to create this error as I didn't downlo…
-
**Description**
Sometimes you get crashes like the ones below.
`ui_model_menu.py` does a nice job of reporting the Python stack when it can, but as you can see below, if you were only to …
mirh updated
2 months ago
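A minimal sketch of the kind of crash reporting described above: wrap the loader so the full Python traceback is always captured and surfaced, even when the load fails. `load_model_safely` and `load_fn` are hypothetical names for illustration, not the repo's actual API.

```python
import traceback


def load_model_safely(load_fn, model_name):
    """Run a model loader and always surface the Python traceback
    instead of letting the crash vanish silently.

    `load_fn` is any callable taking a model name; both names here
    are illustrative assumptions, not text-generation-webui's API.
    """
    try:
        return load_fn(model_name)
    except Exception:
        # Capture the full stack so it can be shown in the UI or logs.
        report = traceback.format_exc()
        print(f"Failed to load {model_name!r}:\n{report}")
        return None
```

The point of the wrapper is that callers get `None` plus a printed stack on failure, rather than an unexplained crash.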
-
### Describe the bug
There is an error when trying to load a model.
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Reproduction
Load model with ExLlamav2_HF M…
-
### Environment
🪟 Windows
### System
Brave Browser
### Version
Version 1.12.4
### Desktop Information
nodejs version v20.10.0
Textgen (ooba) (not sure, but I installed it two days ago, along w…