Mobile-Artificial-Intelligence / maid_llm

maid_llm is a Dart implementation of llama.cpp used by the Mobile Artificial Intelligence Distribution (Maid)
MIT License
29 stars · 8 forks

[Feature Request] Take chat template from tokenizer.chat_template key from the gguf file #3

Closed qnixsynapse closed 1 month ago

qnixsynapse commented 2 months ago

Currently only ChatML and Alpaca templates are available, which I assume are predefined. The best approach would be to take the template from the GGUF file itself; that way, more models would be supported.