Open Galaxy-Husky opened 1 day ago
@Galaxy-Husky Yes, the chat template for that model looks incorrect. It should not contain `{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}` unconditionally, but rather
`{% if add_generation_prompt %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}{% endif %}`
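To illustrate why the `{% if add_generation_prompt %}` guard matters, here is a minimal sketch using jinja2 directly (the same engine `transformers` uses for chat templates). The two template strings below are simplified stand-ins I wrote for illustration, not the model's actual template:

```python
from jinja2 import Template

# Simplified stand-in templates (illustrative, not the model's real template).
# Broken version: always appends the assistant header at the end.
broken = (
    "{% for m in messages %}"
    "<|start_header_id|>{{ m['role'] }}<|end_header_id|>\n\n"
    "{{ m['content'] }}<|eot_id|>"
    "{% endfor %}"
    "{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}"
)

# Fixed version: the assistant header is only emitted when
# add_generation_prompt is True.
fixed = (
    "{% for m in messages %}"
    "<|start_header_id|>{{ m['role'] }}<|end_header_id|>\n\n"
    "{{ m['content'] }}<|eot_id|>"
    "{% endfor %}"
    "{% if add_generation_prompt %}"
    "{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}"
    "{% endif %}"
)

messages = [{"role": "user", "content": "Hi"}]

out_broken = Template(broken).render(messages=messages, add_generation_prompt=False)
out_fixed = Template(fixed).render(messages=messages, add_generation_prompt=False)

# The broken template leaks an assistant header even when no generation
# prompt was requested (e.g. when formatting training data); the fixed
# one only adds it on request.
print("<|start_header_id|>assistant" in out_broken)  # True
print("<|start_header_id|>assistant" in out_fixed)   # False
```

In practice, you could apply the corrected template string to `tokenizer.chat_template` and persist it with `tokenizer.save_pretrained(...)` as a local workaround until the model repo's template is fixed upstream.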
Yes, I agree.
But 2024.11.5 would automatically add `{% if add_generation_prompt %}` to fix the template, while 2024.11.7 does not, so I can't use the model.
Is this the expected behavior?
Hi,
After I upgraded unsloth from 2024.11.5 to 2024.11.7, it raised an error saying the tokenizer's chat template does not contain `{% if add_generation_prompt %}`. The model is shenzhi-wang/Llama3.1-8B-Chinese-Chat, which has the following chat template:
Could you check and fix it?