-
[oobabooga/text-generation-webui](/oobabooga/text-generation-webui), when run with the `--api` flag, publishes a locally available OpenAI-compatible API at http://127.0.0.1:5000/v1/. But trying to change the end…
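A minimal sketch of pointing the `openai` Python client at that local endpoint, assuming the webui was started with `--api`; the dummy API key and the model name are placeholders (the webui answers for whichever model is currently loaded):
```python
# Sketch: talk to text-generation-webui's OpenAI-compatible endpoint.
# Assumes the server was started with --api; api_key is a required
# placeholder for the client, not checked by the webui by default.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:5000/v1", api_key="sk-placeholder")
resp = client.chat.completions.create(
    model="local-model",  # placeholder; the currently loaded model is used
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```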
-
I am trying to use Koboldcpp's OpenAI-compatible API in the Custom Local (OpenAI format) section, but it is not working. I have entered the model name, protocol, and port number. Please let me know if you need …
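As a quick sanity check that the endpoint itself is up, something like the sketch below may help; it assumes Koboldcpp's default port 5001 and its `/v1/models` route (adjust to whatever port you actually configured):
```python
# Sketch: probe Koboldcpp's OpenAI-compatible API before wiring it into a client.
import requests

resp = requests.get("http://127.0.0.1:5001/v1/models", timeout=10)
print(resp.status_code, resp.json())  # expect 200 and a model list if the API is up
```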
-
### Before submitting your bug report
- [ ] I believe this is a bug. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [ ] I'm not able to find an [open issue]…
-
```
D:\llama.cpp\models>..\build\install\bin\main.exe -m qwen1_5-4b-chat-q4_0.gguf -cml --color -i
Log start
main: build = 2725 (784e11de)
main: built with MSVC 19.35.32215.0 for
main: seed = 17…
```
-
Hi brucepro,
I installed your extension (Docker version), but I keep getting a "no python_on whale" notice even though I have already installed everything in requirements.txt.
Here is my log confirming that I already insta…
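A hedged guess: the notice likely refers to the python-on-whales package (import name `python_on_whales`). A quick check from inside the container, with nothing here taken from the extension's own code:
```python
# Sketch: verify the suspected missing package is importable in this environment.
import importlib.util

if importlib.util.find_spec("python_on_whales") is None:
    print("python_on_whales is missing; try: pip install python-on-whales")
else:
    print("python_on_whales is installed")
```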
-
**Describe the bug**
It is possible for Continue to infinitely loop, printing the same output over and over again.
**To Reproduce**
Steps to reproduce the behavior:
1. Using `codellama` via `oll…
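Illustrative only, not taken from Continue's code: one way to characterize the loop described above is a repeated-tail check on the generated text, e.g.:
```python
# Sketch: detect when the last `window` characters of the output have
# repeated `repeats` times in a row, the failure mode described above.
def looks_like_loop(text: str, window: int = 40, repeats: int = 3) -> bool:
    if len(text) < window * repeats:
        return False
    tail = text[-window:]
    return text[-window * repeats:] == tail * repeats
```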
-
### Describe the bug
MinP is now supported by Exllamav2, but it is still being appended to the logits processors in the sampler_hijack.py file, which suggests it could end up applied twice. Unless there is code somewhere that fixes this like r…
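For context, a minimal sketch of what min-p filtering does (keep tokens whose probability is at least `min_p` times the top token's probability), which is why applying it both natively and in the hijacked sampler would be redundant; the function name is illustrative, not taken from sampler_hijack.py:
```python
# Sketch of min-p filtering: mask out tokens whose probability falls below
# min_p * (probability of the most likely token).
import torch

def min_p_filter(logits: torch.Tensor, min_p: float = 0.05) -> torch.Tensor:
    probs = torch.softmax(logits, dim=-1)
    threshold = min_p * probs.max(dim=-1, keepdim=True).values
    return logits.masked_fill(probs < threshold, float("-inf"))
```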
-
I'm having this error; can anyone help me?
```
Traceback (most recent call last):
  File "E:\IA\Texto\Oobabooga\text-generation-webui\modules\extensions.py", line 37, in load_extensio…
```
-
**Describe the bug**
I've set up GGML in `config.py` like so:
```python
default=GGML(
max_context_length=2048,
server_url="http://192.168.1.19:8000"
)
```
…
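Before digging further, a trivial reachability probe against that `server_url` can rule out networking issues; the bare GET below is just a connectivity check, not a documented endpoint of the GGML server:
```python
# Sketch: confirm the GGML server_url from config.py is reachable at all.
import requests

try:
    resp = requests.get("http://192.168.1.19:8000", timeout=5)
    print("reachable, status:", resp.status_code)
except requests.exceptions.ConnectionError as exc:
    print("cannot reach server_url:", exc)
```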
-
Hello, just wanted to say this is a great add-on for talking to local LLMs! One thing I am running into: when the prompt is sent to text-generation-webui, it seems to be adding additional formatting (an…
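One possible workaround, sketched under the assumption that the webui is running with `--api`: send the prompt as a raw string to the `/v1/completions` endpoint instead of the chat endpoint, so no chat template is applied on top of it (the instruction tags in the prompt below are only an example):
```python
# Sketch: raw completion request that bypasses chat-style prompt formatting.
import requests

payload = {
    "prompt": "### Instruction:\nSay hello.\n\n### Response:\n",
    "max_tokens": 64,
}
resp = requests.post("http://127.0.0.1:5000/v1/completions", json=payload, timeout=60)
print(resp.json()["choices"][0]["text"])
```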