juanjuanignacio opened this issue 3 months ago (Open)
Same issue :'(
+1
Kinda hacky, but you can change `{{ raise_exception('System role not supported') }}` in the Gemma2 chat template to `{%- set messages = messages[1:] %}`. It works for me with that change.
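If you want to try that without editing the template file by hand, here is a minimal sketch of the same patch done in Python (assuming `transformers` is installed and you have access to the gated google/gemma-2-2b-it repo, i.e. the same model id as in the config below):

```python
from transformers import AutoTokenizer

# Assumes access to the gated google/gemma-2-2b-it repo.
tok = AutoTokenizer.from_pretrained("google/gemma-2-2b-it")

# Swap the guard that rejects system messages for a line that simply
# drops the leading system turn, as described in the comment above.
tok.chat_template = tok.chat_template.replace(
    "{{ raise_exception('System role not supported') }}",
    "{%- set messages = messages[1:] %}",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# With the stock template this call fails with "System role not supported";
# with the patched template the system turn is silently discarded.
print(tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```

If I remember correctly, vLLM's OpenAI-compatible server also accepts a `--chat-template` option, so you should be able to point it at a modified template file at startup instead of patching the tokenizer.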
But I think it would be nice to be able to omit the system message from the chat-ui side. Looks like the relevant code is here: https://github.com/huggingface/chat-ui/blob/07c9892722dcce34adc1c64e3c84479a2ca4ee83/src/routes/conversation/+server.ts?plain=1#L46-L56
Opened an issue for a potential solution, feel free to tackle it if you want! :smile: #1432
Hello,

I'm running chat-ui and trying a few models. With Phi-3 and Llama I had no problem, but when I serve Gemma 2 with vLLM I can't make any successful API request. My model entry in `.env.local` is:

```json
{
  "name": "google/gemma-2-2b-it",
  "id": "google/gemma-2-2b-it",
  "chatPromptTemplate": "{{#each messages}}{{#ifUser}}<start_of_turn>user\n{{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}}{{content}}<end_of_turn>\n<start_of_turn>model\n{{/ifUser}}{{#ifAssistant}}{{content}}<end_of_turn>\n{{/ifAssistant}}{{/each}}",
  "parameters": {
    "temperature": 0.1,
    "top_p": 0.95,
    "repetition_penalty": 1.2,
    "top_k": 50,
    "truncate": 1000,
    "max_new_tokens": 2048,
    "stop": ["<end_of_turn>"]
  },
  "endpoints": [
    {
      "type": "openai",
      "baseURL": "http://127.0.0.1:8000/v1"
    }
  ]
}
```
and I always get the same response from the vLLM server:

```
ERROR 08-05 12:39:06 serving_chat.py:118] Error in applying chat template from request: System role not supported
INFO:     127.0.0.1:42142 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
```
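(To narrow it down, a direct request against the vLLM endpoint, bypassing chat-ui, should show whether the Gemma 2 template itself is the problem. Below is a sketch assuming the `openai` Python package and the server from the config above; vLLM ignores the API key, but the client requires one to be set.)

```python
from openai import OpenAI

# Local vLLM OpenAI-compatible server from the config above.
client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="not-needed")

# A request with only a user turn should succeed with the stock Gemma 2 template...
resp = client.chat.completions.create(
    model="google/gemma-2-2b-it",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)

# ...while adding a system turn should reproduce the 400 above, since the
# Gemma 2 chat template rejects system messages.
try:
    client.chat.completions.create(
        model="google/gemma-2-2b-it",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"},
        ],
    )
except Exception as e:
    print("system turn rejected:", e)
```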
Does someone know if (and how) I have to change the chat template or disable the system role? Is it a vLLM problem or a chat-ui problem?
Thank you!