CISC opened 1 month ago
Interesting. Hadn't caught that. Could you show an example of listing all chat templates in a gguf file in either pseudo code or python? I'll update the code from that and add you as co-author on the PR.
Sure, see this line from my commit to llama-cpp-python: https://github.com/abetlen/llama-cpp-python/blob/5ab40e6167e64ecb06c2a4d8ae798391bf9a5893/llama_cpp/llama.py#L420
Basically it just collects all the metadata entries whose keys start with `tokenizer.chat_template` (due to the current inability to read the `tokenizer.chat_templates` array).
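In pseudo-code-ish Python, the prefix matching could look like the sketch below. It operates on a plain dict standing in for the GGUF metadata you'd get from a reader; the key naming (bare key for the default template, `tokenizer.chat_template.<name>` for named ones) follows the convention used in that commit.

```python
def list_chat_templates(metadata: dict) -> dict:
    """Collect chat templates from GGUF metadata entries.

    `metadata` is a mapping of GGUF metadata keys to values, as you
    would obtain from whatever GGUF reader you use.
    """
    prefix = "tokenizer.chat_template"
    templates = {}
    for key, value in metadata.items():
        if key.startswith(prefix):
            # The default template lives under the bare key; named
            # templates are suffixed, e.g. "tokenizer.chat_template.rag".
            name = key[len(prefix):].lstrip(".") or "default"
            templates[name] = value
    return templates

# Illustrative metadata, not from a real model file:
meta = {
    "general.name": "example-model",
    "tokenizer.chat_template": "{{ messages }}",
    "tokenizer.chat_template.rag": "{{ documents }}{{ messages }}",
}
print(list_chat_templates(meta))
# {'default': '{{ messages }}', 'rag': '{{ documents }}{{ messages }}'}
```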
Since ggerganov/llama.cpp#6588, GGUFs can contain multiple chat templates, so it might be a good idea to check those too.

Yes, I'm aware that it's a very narrow window for non-default chat templates to be exploitable in llama-cpp-python, since support got added only a few hours prior to the fix and both got released in the same version. However, for completeness and curiosity's sake I'd still like to see it added. :)

Unfortunately my rustacean is weak, so I don't feel comfortable making a PR.