Closed — imtuyethan closed this issue 1 week ago
We need to review the stop token and prompt template.
The current prompt template is `<|system|>\n{system_message}<|user|>\n{prompt}<|assistant|>`, but when I use the jinja2 tool to render the chat template from source, it looks like this instead: `\n\n<|system|>\n{system_message}</s>\n\n\n\n\n<|user|>\n{prompt}</s>\n\n\n<|assistant|>\n\n`. The stop token should be `</s>`.
It is inferred from here.

I used the model from the above source with the correct prompt template + stop token, and it still gives buggy answers. I think this bug is caused by the model's quality.
Thanks for the thorough investigation @nguyenhoangthuan99 🙏 Closing, as it's likely a TinyLlama quality issue at this point.
https://github.com/user-attachments/assets/f8a6eed0-58fd-4a1a-a779-3e7ad5e75675