abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Models with multiple chat templates #1336

Open CISC opened 3 months ago

CISC commented 3 months ago

Not an issue yet, but this will need to be handled once it is implemented here, based on recent transformers changes.

Also note the kwargs change in the same PR, which will be used by e.g. C4AI Command R models (the new chat template is not merged yet) to pass along tools and documents. While we already support tools, it might be worthwhile to support the other things as well.

CISC commented 2 months ago

Any suggestions on how to approach this? It has been merged in llama.cpp for a while now, and many GGUFs already have the new metadata.

I suppose the initial step would be adding e.g. a chat_template_name parameter and applying the chosen template if found (it should probably also report which templates are available, from the tokenizer.chat_templates list).
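
A minimal sketch of that idea, assuming the extra templates live under `tokenizer.chat_template.<name>` keys in the GGUF metadata (the function names and the `chat_template_name` parameter are illustrative, not existing llama-cpp-python API):

```python
# Hypothetical sketch: look up one of several chat templates stored in
# GGUF-style metadata, assuming extra templates use
# "tokenizer.chat_template.<name>" keys alongside the plain
# "tokenizer.chat_template" default.

PREFIX = "tokenizer.chat_template"

def available_templates(metadata: dict) -> list:
    """List the template names found in the metadata dict."""
    names = []
    for key in metadata:
        if key == PREFIX:
            names.append("default")
        elif key.startswith(PREFIX + "."):
            names.append(key[len(PREFIX) + 1:])
    return sorted(names)

def select_template(metadata: dict, chat_template_name: str = "default") -> str:
    """Return the Jinja2 source of the requested template, or raise with
    the list of available names so the caller can see what exists."""
    key = PREFIX if chat_template_name == "default" else f"{PREFIX}.{chat_template_name}"
    try:
        return metadata[key]
    except KeyError:
        raise ValueError(
            f"unknown chat template {chat_template_name!r}; "
            f"available: {available_templates(metadata)}"
        ) from None
```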

For the server this gets more complicated; it would probably make sense to allow the caller to choose a template, and then also have an endpoint to see which templates are available?
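
Such an endpoint could build a payload along these lines; the route and response shape are assumptions (no such endpoint exists in the server today), shown here as a plain function that a FastAPI route could wrap:

```python
# Hypothetical sketch of the response body for a "list chat templates"
# server endpoint; the {"object": "list", "data": [...]} shape mirrors the
# OpenAI-style listings the server already uses, but is only an assumption.

def list_chat_templates_response(metadata: dict) -> dict:
    prefix = "tokenizer.chat_template"
    names = ["default"] if prefix in metadata else []
    names += sorted(k[len(prefix) + 1:] for k in metadata if k.startswith(prefix + "."))
    return {"object": "list", "data": [{"name": n} for n in names]}
```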

Finally, how would you go about adding support for additional parameters to the template, like documents in the rag template?
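
One way to handle that is to forward any unrecognized keyword arguments straight through to the template renderer. The sketch below avoids a Jinja2 dependency by using a toy stand-in formatter; in practice the kwargs would become template variables, and every name here is illustrative:

```python
# Sketch of forwarding extra template parameters (e.g. documents for a RAG
# template) from the caller down to the renderer. render_rag_prompt is a toy
# stand-in for a real Jinja2 RAG template; the real one would receive
# `documents` as a template variable.

def render_rag_prompt(messages, documents):
    parts = [f"<doc {i}> {doc['text']}" for i, doc in enumerate(documents)]
    parts += [f"{m['role']}: {m['content']}" for m in messages]
    return "\n".join(parts)

def create_chat_completion(messages, **template_kwargs):
    # Unrecognized kwargs (tools, documents, ...) pass through untouched.
    return render_rag_prompt(messages, template_kwargs.get("documents", []))
```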

abetlen commented 2 months ago

@CISC do you mind posting a GGUF that uses this right now?

Yeah, I think we can make it even simpler and not introduce any new parameters, just use the existing chat_format. When no chat_format is specified we use default; the others can then be specified there by name.
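
A sketch of that dispatch, with a registry decorator that mirrors (but is not) llama-cpp-python's internal chat-format registration; the function names are illustrative:

```python
# Sketch: keep using the existing chat_format argument. Built-in handlers
# live in a registry; any other name is looked up among the templates
# embedded in the model metadata.

_CHAT_HANDLERS = {}

def register_chat_format(name):
    def decorator(fn):
        _CHAT_HANDLERS[name] = fn
        return fn
    return decorator

@register_chat_format("llama-2")
def format_llama2(messages):
    # toy built-in formatter, for illustration only
    return "".join(f"[INST] {m['content']} [/INST]" for m in messages)

def resolve(chat_format, metadata):
    """Return a built-in handler, or the metadata template source."""
    if chat_format in _CHAT_HANDLERS:
        return _CHAT_HANDLERS[chat_format]
    key = ("tokenizer.chat_template" if chat_format == "default"
           else f"tokenizer.chat_template.{chat_format}")
    return metadata[key]
```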

The chat formats will be accessible through the metadata, so I'm not sure we need to add anything new there, but we should add an option to change the chat format after initialization (I believe this has already been requested before).
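
A hypothetical sketch of that post-init switch; the `set_chat_format` method and its validation are assumptions, not existing llama-cpp-python API:

```python
# Hypothetical: allow changing the chat format after initialization,
# validated against the formats discovered when the model was loaded.

class ChatModel:
    def __init__(self, available, chat_format="default"):
        self.available = set(available)  # formats known at load time
        self.chat_format = "default"
        self.set_chat_format(chat_format)

    def set_chat_format(self, name):
        if name not in self.available:
            raise ValueError(f"unknown chat format {name!r}; "
                             f"available: {sorted(self.available)}")
        self.chat_format = name
```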

CISC commented 2 months ago

Sure, pmysl was the first one to update their quants. If R+ is a bit too hefty, try LlamaEdge's Command R quant.

My main worry about using chat_format is that it might conflict with an existing choice, albeit unlikely.

abetlen commented 2 months ago

@CISC good point, let's prefix these dynamically loaded chat templates with chat_template, so chat_template.rag or chat_template.tool_use for the Cohere model.

CISC commented 2 months ago

@abetlen That seems reasonable. I'm thinking of registering chat_template.default etc. as chat formats at init, using the Jinja2 handler setup that is done as a fallback today, and then just falling back to chat_template.default (if registered) instead.
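
The registration-plus-fallback idea could look like this; key names and function names are illustrative only:

```python
# Sketch of the proposal above: at init, register every template found in
# the metadata under a "chat_template."-prefixed name, then fall back to
# "chat_template.default" when the requested format is not registered.

def register_metadata_templates(metadata, registry):
    base = "tokenizer.chat_template"
    if base in metadata:
        registry["chat_template.default"] = metadata[base]
    for key, value in metadata.items():
        if key.startswith(base + "."):
            registry["chat_template." + key[len(base) + 1:]] = value

def pick(registry, chat_format):
    if chat_format in registry:
        return registry[chat_format]
    return registry.get("chat_template.default")
```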

CISC commented 1 month ago

WIP changes worth paying attention to: huggingface/transformers#30621

CISC commented 4 weeks ago

Another related PR is huggingface/transformers#31429, which could be nice to replicate here; however, it requires us to differentiate between specifically selecting chat_template.default and merely defaulting to it, as we may not want to force chat_template.tool_use just because tools are passed.
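
One way to keep those two cases apart is a sentinel default, so "caller left the format unset" is distinguishable from "caller explicitly chose default". A sketch under that assumption (all names illustrative):

```python
# Sketch: an explicit chat_format always wins; auto-selecting
# chat_template.tool_use when tools are passed only happens if the caller
# left the format unset. The _UNSET sentinel separates "defaulted" from
# "explicitly chose chat_template.default".

_UNSET = object()

def choose_template(registry, chat_format=_UNSET, tools=None):
    if chat_format is not _UNSET:
        return registry[chat_format]  # explicit choice, never overridden
    if tools and "chat_template.tool_use" in registry:
        return registry["chat_template.tool_use"]
    return registry["chat_template.default"]
```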