-
## 🐛 Bug
I'm trying to follow [this](https://github.com/mlc-ai/notebooks/blob/main/mlc-llm/tutorial_extensions_to_more_model_variants.ipynb) tutorial to get Zephyr-7B (a Mistral variant) using the lat…
-
### Your current environment
```text
Collecting environment information...
PyTorch version: 2.1.2+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A
…
```
-
### Your current environment
```text
The output of `python collect_env.py`
```
### 🐛 Describe the bug
Recently, we have seen reports of `AsyncEngineDeadError`, including:
- [ ] #5060
…
-
### System Info
```text
Package                  Version
------------------------ ---------------
accelerate               0.29.0.dev0
aiohttp                  3.9.3
aiosignal                1.3.1
annot…
```
-
# Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as o…
-
### Describe the bug
When I call the server with the OpenAI example code, the response comes back using the default chat template. I also see the following warning message in the console:
```
…
```
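For context, one client-side workaround (a minimal sketch, assuming the server is an OpenAI-compatible endpoint and that the model ships its own chat template; the model ID and base URL below are placeholders, not taken from this report) is to render the prompt locally with the model's template and call the plain completions endpoint, so the server's default template is never applied:
```python
# Hedged sketch: apply the model's own chat template client-side and use the
# completions endpoint. The model ID and base URL are placeholder assumptions.
from openai import OpenAI
from transformers import AutoTokenizer

model_id = "HuggingFaceH4/zephyr-7b-beta"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [{"role": "user", "content": "Hello, who are you?"}]
# Render the chat template stored in the model's tokenizer config.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
# The completions endpoint takes the prompt as-is, so the server's default
# chat template is not applied on top of it.
response = client.completions.create(model=model_id, prompt=prompt, max_tokens=128)
print(response.choices[0].text)
```
If the server happens to be vLLM's OpenAI-compatible server, its `--chat-template` option can also point it at a Jinja template file so it stops falling back to the default.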
-
### Your current environment
```text
The output of `python collect_env.py`
WARNING 10-23 23:26:52 _custom_ops.py:19] Failed to import from vllm._C with ModuleNotFoundError("No module named 'vllm._C'")
…
```
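For what it's worth, this warning generally indicates that the compiled extension module could not be imported in the current environment (for example, an install without prebuilt kernels or a source checkout that was never built). A quick check, assuming only the module name `vllm._C` that appears in the warning itself:
```python
# Minimal sanity check for the native extension named in the warning above.
# The only assumption is the module name vllm._C, taken from the warning text.
import importlib

try:
    importlib.import_module("vllm._C")
    print("vllm._C imported successfully; compiled ops are available")
except ModuleNotFoundError as err:
    # Usually means vLLM was installed without its compiled kernels,
    # e.g. running from an unbuilt source tree or a broken wheel.
    print(f"compiled ops are missing: {err}")
```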
-
## 🐛 Bug
The model initialised successfully and displayed the "ready to chat" message, but an error occurs when a message is sent.
Stack trace:
```text
File "/mlc-llm/3rdparty/tvm/src/runtime/opencl/opencl_module.cc", line…
```
-
### Your current environment
```text
The output of `python collect_env.py`
```
### 🐛 Describe the bug
Running LLaVA-NeXT with llava-hf/llava-v1.6-mistral-7b-hf:
```text
python -m vllm.entrypoints.…
```
-
### Describe the issue as clearly as possible:
I cannot create an outlines generator with a Mistral model and a pydantic schema.
### Steps/code to reproduce the bug:
```python
import outlines
from p…
```
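For reference, a minimal sketch of the pattern this report appears to be attempting, assuming the `outlines.generate.json` API together with a placeholder Mistral model ID and a placeholder pydantic schema (neither is taken from the truncated code above):
```python
# Hedged sketch: constrained JSON generation with outlines + pydantic.
# The model ID and the Character schema are placeholders, not from the report.
import outlines
from pydantic import BaseModel


class Character(BaseModel):
    name: str
    age: int


# Load a Mistral-family model through the transformers backend.
model = outlines.models.transformers("mistralai/Mistral-7B-Instruct-v0.2")

# Build a generator whose output is constrained to JSON matching the schema.
generator = outlines.generate.json(model, Character)
result = generator("Describe a fantasy character as JSON.")
print(result)  # parsed into a Character instance
```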