maziyarpanahi opened this issue 3 months ago
Hi,
I have access to a vLLM endpoint serving a Mixtral model. Similar to the discussion in https://github.com/huggingface/chat-ui/discussions/1127, I am seeing this error:
ERROR 06-10 20:48:19 serving_chat.py:158] Error in applying chat template from request: Conversation roles must alternate user/assistant/user/assistant/...
However, that discussion only offers a workaround; my system prompt is actually empty. If there is nothing in the system prompt, it should not be sent to vLLM at all.
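To illustrate the behavior I'd expect, here is a minimal TypeScript sketch (the function and variable names are hypothetical, not chat-ui's actual code): skip the system message entirely when the prompt is empty, so the chat template only ever sees alternating user/assistant turns.

```ts
// Hypothetical sketch: drop an empty system message before sending the
// request to an OpenAI-compatible backend such as vLLM.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildMessages(systemPrompt: string, history: ChatMessage[]): ChatMessage[] {
  const messages: ChatMessage[] = [];
  // Only include the system message when the prompt is non-empty,
  // so Mixtral's chat template never receives an empty system turn.
  if (systemPrompt.trim().length > 0) {
    messages.push({ role: "system", content: systemPrompt });
  }
  return [...messages, ...history];
}

// With an empty system prompt, only user/assistant turns are sent, which
// preserves the strict role alternation that Mixtral's template expects.
const payloadMessages = buildMessages("", [
  { role: "user", content: "Hello!" },
]);
```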
Steps to reproduce:
Configure the model with an `openai`-type endpoint pointing at the vLLM server (see the example below).
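For reference, my `MODELS` entry looks roughly like this (the model name and `baseURL` below are placeholders for my actual vLLM deployment):

```env
MODELS=`[
  {
    "name": "mixtral",
    "displayName": "Mixtral (vLLM)",
    "endpoints": [
      {
        "type": "openai",
        "baseURL": "http://localhost:8000/v1"
      }
    ]
  }
]`
```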
This works without any issue with TGI, however.