-
Relevant documentation: https://mlc.ai/mlc-llm/docs/get_started/mlc_chat_config.html
-
### Overview
The documentation states that it is possible to use any OpenAI-compatible API.
Since I have a working local installation of `text-generation-webui`, I attempted to use it with my alread…
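
Calling a local OpenAI-compatible server generally amounts to pointing a standard chat-completions request at its base URL. Below is a minimal stdlib-only sketch; the port `5000`, the model name `local-model`, and the dummy API key are assumptions about a typical `text-generation-webui` setup, not values taken from the report above — adjust them to your installation.

```python
import json
import urllib.request

# Assumed base URL for a local text-generation-webui instance with its
# OpenAI-compatible API enabled; change host/port to match your setup.
BASE_URL = "http://127.0.0.1:5000/v1"


def build_chat_request(prompt: str, model: str = "local-model"):
    """Build the endpoint URL and JSON body for a chat-completion call."""
    url = f"{BASE_URL}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return url, body


def chat(prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    url, body = build_chat_request(prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Most local servers ignore the key but still expect the header.
            "Authorization": "Bearer dummy-key",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]
```

If the server really is OpenAI-compatible, the same request shape works whether it is backed by `text-generation-webui`, a proxy, or the hosted API; only `BASE_URL` and the key change.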
-
I generated [model outputs](https://github.com/evalplus/evalplus/files/11841987/samples.jsonl.txt) using [WizardLM/WizardCoder-15B-V1.0](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0) and evalu…
-
### System Info
- `transformers` version: 4.29.2
- Platform: Linux-5.4.0-137-generic-x86_64-with-glibc2.31
- Python version: 3.11.3
- Huggingface_hub version: 0.15.1
- Safetensors version: 0.3.1
…
-
### Problem Description
I'm trying to get AgentLLM to work with Vicuna and WizardLM through Oobabooga. It's very tough to debug without working examples of the exchanges, and it would help enable d…
-
**Description**
The `functions` and `function_call` arguments are currently not implemented in the OpenAI extension.
It would be useful to implement them, as projects like open-interpreter make use of the…
deece updated 9 months ago
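
For reference, this is roughly what a request carrying those fields looks like in the legacy OpenAI chat-completions format that the extension would need to accept. The `get_weather` function and the `local-model` name are hypothetical examples for illustration only.

```python
import json


def build_function_call_request(user_message: str) -> dict:
    """Build a chat-completions body using the legacy `functions` /
    `function_call` fields the issue asks the extension to support."""
    return {
        # Placeholder model name; a local server typically uses whatever
        # model it has loaded.
        "model": "local-model",
        "messages": [{"role": "user", "content": user_message}],
        "functions": [
            {
                "name": "get_weather",  # hypothetical example function
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"},
                    },
                    "required": ["city"],
                },
            }
        ],
        # "auto" lets the model decide whether to call a function;
        # {"name": "get_weather"} would force that specific call.
        "function_call": "auto",
    }


body = build_function_call_request("What's the weather in Oslo?")
print(json.dumps(body, indent=2))
```

A compatible server would respond with an assistant message whose `function_call` field contains the chosen function name and JSON-encoded arguments, which the client then executes and feeds back as a `function`-role message.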
-
Using https://docs.gpt4all.io/gpt4all_python.html
-
Hello, I'm really interested in using your extension! I can load it properly and have installed the `requirements.txt` dependencies, but when I begin a chat this is the error I'm getting:
To create a public link, s…
-
### What happened?
I was trying to get the OpenAI Proxy Server working locally, following the documentation:
1. git clone https://github.com/BerriAI/litellm.git
2. Modify template_secrets.tom…
-
**GPU ENV:**
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 450.102.04 Driver Version: 450.102.04 CUDA Version: 11.0 |
|-----------------------…