-
### Describe the bug
ERROR:Failed to load the extension "superbooga".
Traceback (most recent call last):
File "F:\oobabooga-windows\text-generation-webui\modules\extensions.py", line 34, in load_…
-
vLLM is an open-source LLM inference and serving library that delivers up to 24x higher throughput than HuggingFace Transformers and powers Vicuna and Chatbot Arena.
Blog post: https://vllm.ai/
Repo: https://github.com/…
-
### Describe the bug
following guide https://github.com/oobabooga/text-generation-webui/wiki/09-%E2%80%90-Docker
at the step `cp .env.example .env`,
there is no `.env.example` file
### Is there an existin…
-
I'm having problems installing and getting Qdrant to work. I've never really used it and have no idea how to configure it. It would be great if there was a tutorial on YouTube or someth…
-
Hi,
I think you have not made it clear at all that this code as-is cannot be run locally, and that it relies on HF remote inference. I only realized that when I finally got it to run and it asked m…
-
### Describe the bug
## Description
I installed with one click installer on Ubuntu 20.04 and I called the conda environment with
```bash
source "./installer_files/conda/etc/profile.d/conda.sh" &&
…
```
B0-B updated 6 months ago
-
I was able to install the llama-cpp-python server with pip and local LLM plugin from BRAT. I keep getting the following error when I try to use LLM Instruction:
`Error: SyntaxError: Unexpected non-…
-
Is there any way I can load the character I made in oobabooga webui?
Really nice job btw, thanks!
-
I start up the local LLM with JAN.AI.
http://127.0.0.1:1337/v1/chat/completions
model: openchat-3.5-7b
I followed the guide and tested that the API works.
https://github.com/oobabooga/text-generation-web…
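The API test described above can be sketched in Python. This is a hedged example, not taken from the guide: the URL and model name are copied from the report, but the message content and the helper name `post_chat` are hypothetical, and running the request requires the JAN.AI server to actually be listening on that port.

```python
import json
from urllib import request

# Endpoint and model as given in the report above.
URL = "http://127.0.0.1:1337/v1/chat/completions"

# Payload in the OpenAI-compatible chat-completions shape.
payload = {
    "model": "openchat-3.5-7b",
    "messages": [{"role": "user", "content": "Say hello."}],
}

def post_chat(url=URL, body=payload):
    """POST the chat payload and return the parsed JSON reply.

    Hypothetical helper: only works when the local server is running.
    """
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# We do not call post_chat() here, since that needs a live server;
# instead we just show the payload the endpoint expects.
print(json.dumps(payload, indent=2))
```

If the server is up, `post_chat()` should return a dict whose `choices[0]["message"]["content"]` holds the model's reply, matching what the guide's curl test prints.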
-
I tried many parameter combinations, but none of them loaded successfully.