Closed by sjrl 9 months ago
Great find @sjrl, we need this for 1.x and 2.0 as well.
Just as a heads-up, it looks like this feature might only be in main (at least according to their current docs: https://huggingface.co/docs/transformers/main/chat_templating#templates-for-chat-models), so we might need to wait on this.
I am interested in using that feature with PromptNode. It was released in v4.34.0 (https://huggingface.co/docs/transformers/v4.34.0/en/chat_templating). What is the recommended way of using it?
Any updates on this @sjrl @vblagoje?
Can you give me the same example but for CPU?
Feature Request
Transformers recently added a new chat templating feature, apply_chat_template, which auto-applies the right formatting around the messages for chat models.
Using this could greatly improve the user experience of open-source LLMs since users would no longer have to manually add the correct tokens and formatting to prompts sent to the PromptNode.
Here is a full example of how to use this new function with the Mistral Instruct model.
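A minimal sketch of what that looks like (the checkpoint name, prompt, and generation settings below are illustrative; the snippet falls back to CPU when no GPU is available):

```python
# Sketch: chat templating with a Mistral Instruct checkpoint (illustrative values).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-Instruct-v0.1"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

messages = [
    {"role": "user", "content": "What is the capital of France?"},
]

# apply_chat_template reads the chat template stored with the tokenizer and wraps
# the messages in the model's expected format (e.g. [INST] ... [/INST] for Mistral
# Instruct), so no tokens need to be added by hand.
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(device)

outputs = model.generate(input_ids, max_new_tokens=100, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```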
Alternative
Rely on users to manually add the correct tokens to the prompt sent to the PromptNode.