Open ahuang11 opened 9 months ago
It's just using the LangChain templating, so langchain_community is doing all the work of converting for every model provider.
I don't think so; I think funcchain is hardcoding prompt formats. https://github.com/shroominic/funcchain/blob/main/src/funcchain/model/patches/llamacpp.py#L269-L283
llama-cpp-python actually supports different chat_formats.
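For context, llama-cpp-python selects a prompt formatter at load time via the chat_format argument (e.g. "llama-2", "chatml", "mistral-instruct"). A minimal sketch of that dispatch idea, with simplified formatters for illustration only (this is not llama-cpp-python's actual implementation):

```python
# Sketch of a chat_format dispatch table, loosely mimicking the idea behind
# passing chat_format= to llama_cpp.Llama(). Formatters are simplified.

def format_llama2(messages):
    # Llama-2 chat wraps each user turn in [INST] ... [/INST]
    out = ""
    for m in messages:
        if m["role"] == "user":
            out += f"<s>[INST] {m['content']} [/INST]"
        elif m["role"] == "assistant":
            out += f" {m['content']} </s>"
    return out

def format_chatml(messages):
    # ChatML uses explicit role headers with <|im_start|>/<|im_end|> markers
    return "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    ) + "<|im_start|>assistant\n"

CHAT_FORMATS = {"llama-2": format_llama2, "chatml": format_chatml}

def build_prompt(chat_format, messages):
    # Look up the named format and render the conversation with it
    return CHAT_FORMATS[chat_format](messages)

msgs = [{"role": "user", "content": "Hi"}]
print(build_prompt("llama-2", msgs))  # <s>[INST] Hi [/INST]
print(build_prompt("chatml", msgs))
```

The point is that hardcoding a single format (as the linked funcchain code appears to do) silently produces the wrong prompt for any model trained on a different template.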
Yes, true; we discussed this in Discord and need to look into that!
Hi again. I'm wondering: does funcchain handle different chat templates internally, e.g. for Llama vs. Mistral?
References:
https://www.reddit.com/r/LocalLLaMA/comments/1afweyw/comment/kofabzx/
https://www.promptingguide.ai/models/mistral-7b#chat-template-for-mistral-7b-instruct
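To make the difference concrete, here is a hedged sketch of the two single-turn templates described in the linked guides: Llama-2 chat puts the system prompt in a <<SYS>> block inside the first [INST], while Mistral-7B-Instruct's template has no system slot at all (a common workaround is to prepend the system text to the first user turn). Simplified single-turn formatters, for illustration only:

```python
def llama2_prompt(system, user):
    # Llama-2 chat: system prompt goes in a <<SYS>> block inside the first [INST]
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

def mistral_instruct_prompt(system, user):
    # Mistral-7B-Instruct has no dedicated system slot; prepend the system
    # text to the first user message as a workaround
    merged = f"{system}\n\n{user}" if system else user
    return f"<s>[INST] {merged} [/INST]"

print(llama2_prompt("Be brief.", "Hi"))
print(mistral_instruct_prompt("Be brief.", "Hi"))
```

So a library that always emits the Llama-2 form would feed Mistral a <<SYS>> block it was never trained on, which is exactly why per-model template handling matters here.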