Closed NourOM02 closed 1 week ago
Could you please try llm = ChatHuggingFace(llm=llm, tokenizer=llm.pipeline.tokenizer)?
FYI: updated the docs for easier debugging in https://github.com/huggingface/transformers/pull/33652
I tested your solution, which raises no errors, but the model isn't able to make any tool calls (it finishes the run within a single completion, even though the agent is expected to take multiple tool-calling steps). I tried Ollama to check whether the problem was the model's abilities, which wasn't the case.
google/gemma-2-2b-it does not support system messages: "Template error: syntax error: System role not supported"
I also tried other models (e.g. meta-llama), and other methods (e.g. HuggingFaceEndpoint), and they work with ChatHuggingFace.
At this point it seems to me like a model problem, and I would recommend that you provide an MRE, preferably with a model that works / used to work with an earlier version.
Imho, the original issue (chat_template) raised seems to be resolved.
URL
https://python.langchain.com/docs/tutorials/sql_qa/
Checklist
Issue with current documentation:
Goal
Create a SQL agent that interacts with a SQL database using a local model.
My implementation
I am trying to use a local model from Hugging Face and create a chat model instance using the ChatHuggingFace class. I implemented the same agent code as in the tutorial above, with the necessary changes to make it work with a Hugging Face model.
Configuring LangSmith and HuggingFace tokens
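My setup for this step was along these lines (a sketch; the placeholder values stand in for real keys):

```python
import os

# Enable LangSmith tracing and provide the tokens via environment variables.
# The placeholder strings below are illustrative, not real credentials.
os.environ.setdefault("LANGCHAIN_TRACING_V2", "true")
os.environ.setdefault("LANGCHAIN_API_KEY", "<your-langsmith-key>")
os.environ.setdefault("HUGGINGFACEHUB_API_TOKEN", "<your-hf-token>")
```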
Needed packages
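Roughly the packages the tutorial plus the Hugging Face integration need (versions omitted; adjust to your environment):

```shell
pip install --upgrade langchain langchain-community langchain-huggingface langgraph transformers torch
```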
Setup the connection to the database
Setup the LLM
Build hugging face pipeline to use the model with the langchain package
Create Agent
Run the agent
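The "Create Agent" and "Run the agent" steps correspond roughly to the following sketch, assuming db and chat from the previous steps (the prompt wording and the question are illustrative):

```python
from langchain_community.agent_toolkits import SQLDatabaseToolkit
from langgraph.prebuilt import create_react_agent

# db and chat come from the earlier setup steps.
toolkit = SQLDatabaseToolkit(db=db, llm=chat)
agent_executor = create_react_agent(
    chat,
    toolkit.get_tools(),
    prompt="You are an agent that answers questions against a SQL database.",
)

# Stream intermediate steps so each tool call is visible before the final answer.
for step in agent_executor.stream(
    {"messages": [("user", "Which country's customers spent the most?")]},
    stream_mode="values",
):
    step["messages"][-1].pretty_print()
```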
Expected behaviour
As demonstrated in the tutorial, the steps taken by the LLM should be displayed before the final answer.
Actual behaviour
ValueError: Cannot use apply_chat_template() because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating
Additional experiments
I tried to visualize the chat_template in the loaded model using:
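The snippet was along these lines, printing the template straight off the tokenizer (gemma-2-2b-it is gated, so this needs an authorized HF token):

```python
from transformers import AutoTokenizer

# Load the tokenizer directly and inspect its chat template.
tokenizer = AutoTokenizer.from_pretrained("google/gemma-2-2b-it")
print(tokenizer.chat_template)
```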
I get the following template (which means everything is good): "{{ bos_token }}{% if messages[0]['role'] == 'system' %}{{ raise_exception('System role not supported') }}{% endif %}{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% if (message['role'] == 'assistant') %}{% set role = 'model' %}{% else %}{% set role = message['role'] %}{% endif %}{{ '' + role + '\n' + message['content'] | trim + '\n' }}{% endfor %}{% if add_generation_prompt %}{{'model\n'}}{% endif %}"
However, when I try to access the chat_template after initializing ChatHuggingFace, I notice that there is no chat_template:
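Roughly (assuming llm is the HuggingFacePipeline built above):

```python
from langchain_huggingface import ChatHuggingFace

chat_model = ChatHuggingFace(llm=llm)  # llm: the HuggingFacePipeline from above
print(chat_model.tokenizer.chat_template)  # no template was set here
```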
My conclusion is that there is a problem with ChatHuggingFace that makes the chat_template go missing!
Idea or request for content:
No response