huggingface / chat-ui

Open source codebase powering the HuggingChat app
https://huggingface.co/chat
Apache License 2.0

Sending system_prompt to LLM even if it's empty #1263

Open maziyarpanahi opened 3 months ago

maziyarpanahi commented 3 months ago

Hi,

I have access to a vLLM endpoint serving a Mixtral model. Similar to the discussion in https://github.com/huggingface/chat-ui/discussions/1127, I am seeing this error:

```
ERROR 06-10 20:48:19 serving_chat.py:158] Error in applying chat template from request: Conversation roles must alternate user/assistant/user/assistant/...
```
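My understanding is that the error comes from the Mixtral chat template, which only accepts alternating user/assistant roles and has no system role. A minimal sketch of the kind of request body that would trigger it, assuming chat-ui includes the system message even when it is empty (the model name and contents below are illustrative, not copied from my setup):

```ts
// Illustrative payload only: an empty system message is still included,
// so the Mixtral template on the vLLM side rejects the conversation.
const body = {
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
  messages: [
    { role: "system", content: "" }, // empty, but still sent
    { role: "user", content: "Hello!" },
  ],
};
```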

However, that only seems to be a workaround: my system prompt is empty. If there is nothing in the system prompt, it should not be sent to vLLM at all.
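A minimal sketch of what I mean, in TypeScript: drop the system message entirely when the preprompt is empty, so endpoints whose chat templates reject the system role still receive a valid user/assistant-only conversation. The names `preprompt`, `conversation`, and `buildMessages` are placeholders for illustration, not chat-ui's actual variables or functions.

```ts
type Message = { role: "system" | "user" | "assistant"; content: string };

// Sketch: only prepend a system message if the preprompt is non-empty.
function buildMessages(preprompt: string | undefined, conversation: Message[]): Message[] {
  const messages: Message[] = [];
  if (preprompt && preprompt.trim().length > 0) {
    messages.push({ role: "system", content: preprompt });
  }
  return [...messages, ...conversation];
}
```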

Steps to reproduce:

This works without any issue with TGI, however.