Open chiragjn opened 3 weeks ago
See https://github.com/truefoundry/axolotl/pull/5/commits/5ba183d302ed1c91912555b76e423786acaccae8 for a rough implementation. I plan to submit this as a PR soon.
Hi, I'm trying to fine-tune Gemma 2 for function calling. Our tool definitions are in the system prompt, but the Gemma chat template doesn't support the system role:

jinja2.exceptions.TemplateError: System role not supported

How can we update the chat_template to support the system role?
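One common workaround (a sketch, not part of axolotl's API — the function name and message format here are illustrative) is to fold a leading system message into the first user turn before the template is applied, so templates like Gemma's that reject the system role never see it:

```python
def merge_system_into_first_user(messages):
    """Fold a leading system message into the first user turn, for chat
    templates (like Gemma's) that raise on the system role."""
    if not messages or messages[0]["role"] != "system":
        return messages
    system, rest = messages[0], messages[1:]
    if rest and rest[0]["role"] == "user":
        # Prepend the system text (e.g. tool definitions) to the first user turn.
        merged = dict(rest[0])
        merged["content"] = system["content"] + "\n\n" + rest[0]["content"]
        return [merged] + rest[1:]
    # No user turn to merge into: re-emit the system text as a user turn.
    return [{"role": "user", "content": system["content"]}] + rest


msgs = [
    {"role": "system", "content": "Available tools: get_weather(city)"},
    {"role": "user", "content": "Weather in Paris?"},
]
merged = merge_system_into_first_user(msgs)
```

After this preprocessing step, the merged conversation can be passed to `tokenizer.apply_chat_template` as usual.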
⚠️ Please check that this feature request hasn't been suggested before.
🔖 Feature description
Currently, when using

type: chat_template

we have to choose a chat template from those encoded in the axolotl codebase (defaults to chatml):

https://github.com/OpenAccess-AI-Collective/axolotl/blob/a82a7115224b7aef14301387a11ad4729fd6ca52/src/axolotl/prompt_strategies/chat_template.py#L99-L102
https://github.com/OpenAccess-AI-Collective/axolotl/blob/a82a7115224b7aef14301387a11ad4729fd6ca52/src/axolotl/prompt_strategies/chat_template.py#L115-L126
https://github.com/OpenAccess-AI-Collective/axolotl/blob/a82a7115224b7aef14301387a11ad4729fd6ca52/src/axolotl/utils/chat_templates.py#L21-L29
I would like to use the tokenizer's own

tokenizer.chat_template

as that would help the model start from a familiar place.

✔️ Solution
I would like to pass

chat_template

as None, in which case the template should be picked up from the tokenizer, raising an error or falling back to chatml if tokenizer.chat_template is missing. I can work on this; it seems like a small change.
❓ Alternatives
No response
📝 Additional Context
No response
Acknowledgements