Closed: JulienSantiago closed this 3 months ago
Hey, @JulienSantiago! I'm here to assist you with any bugs, questions, or contributions you might have.
The error occurs because the model you are using does not support the function calling API. In the first example, the model supports tool calling because it uses a `FunctionCallingLLM` with the `is_function_calling_model` attribute set to `True`. However, in the agentic strategy, the model you are using does not have this attribute set correctly, leading to the `ValueError`.

To resolve this issue, ensure that the `FunctionCallingLLM` you are using has the `is_function_calling_model` attribute set to `True`. Here is the relevant code snippet that checks this:
```python
if not llm.metadata.is_function_calling_model:
    raise ValueError(
        f"Model name {llm.model} does not support function calling API. "
    )
```
Make sure your `FunctionCallingLLM` is correctly configured to support function calling. The `FunctionCallingLLM` class in the LlamaIndex framework provides methods like `chat_with_tools`, `achat_with_tools`, `get_tool_calls_from_response`, `predict_and_call`, and `apredict_and_call` to handle tool calling. Ensure these methods are properly implemented and that the `is_function_calling_model` attribute is set to `True` in the metadata [1][2][3].
Edit: the difference is that the first code uses a ReAct agent.
And the code

```python
from llama_index.core.agent import ReActAgentWorker
from llama_index.core.agent import AgentRunner

agent_worker = ReActAgentWorker.from_tools(
    [get_price_tool, get_cash_tool],
    llm=llm,
    verbose=True,
)
agent = AgentRunner(agent_worker)
```

gives no error and works well.
Yeah, the solution here is to use a ReAct agent. The function calling worker is only for LLMs that implement function calling in their API (OpenAI, Anthropic, etc.).
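The pattern implied by this comment can be sketched as a small selector: dispatch on the model's function-calling capability and fall back to ReAct (which drives tools purely through prompting) for local models. This is a hypothetical sketch; the stub classes and the `choose_worker` helper are illustrative stand-ins, where real code would import `ReActAgentWorker` and `FunctionCallingAgentWorker` from `llama_index.core.agent`:

```python
# Hypothetical sketch: pick the agent strategy from the model's capability.
# The stub classes below stand in for LlamaIndex's real agent workers.

class ReActWorkerStub:
    """Stand-in for ReActAgentWorker: tool use via prompted reasoning."""
    name = "react"

class FunctionCallingWorkerStub:
    """Stand-in for FunctionCallingAgentWorker: native tool-calling API."""
    name = "function_calling"

def choose_worker(is_function_calling_model: bool):
    # Models with native function calling (OpenAI, Anthropic, ...) can use
    # the function calling worker; anything else (e.g. a local GGUF model)
    # falls back to ReAct, which needs no special API support.
    if is_function_calling_model:
        return FunctionCallingWorkerStub()
    return ReActWorkerStub()

print(choose_worker(False).name)  # → react
```

In real code the same dispatch condition is available as `llm.metadata.is_function_calling_model`, i.e. the exact attribute the worker's constructor checks.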
Feature Description
This code works well and returns:

which is the correct price for the stock at the time of writing this issue.

However, I am trying to use an agentic strategy in order to answer more complex queries, and the code:

gives the error:
```
ValueError                                Traceback (most recent call last)
Cell In[16], line 4
      1 from llama_index.core.agent import FunctionCallingAgentWorker
      2 from llama_index.core.agent import AgentRunner
----> 4 agent_worker = FunctionCallingAgentWorker.from_tools(
      5     [get_price_tool, get_cash_tool],
      6     llm=llm,
      7     verbose=True
      8 )
      9 agent = AgentRunner(agent_worker)

File ~/anaconda3/envs/intellisearch/lib/python3.12/site-packages/llama_index/core/agent/function_calling/step.py:155, in FunctionCallingAgentWorker.from_tools(cls, tools, tool_retriever, llm, verbose, max_function_calls, callback_manager, system_prompt, prefix_messages, **kwargs)
    151     prefix_messages = [ChatMessage(content=system_prompt, role="system")]
    153 prefix_messages = prefix_messages or []
--> 155 return cls(
    156     tools=tools,
    157     tool_retriever=tool_retriever,
    158     llm=llm,
    159     prefix_messages=prefix_messages,
    160     verbose=verbose,
    161     max_function_calls=max_function_calls,
    162     callback_manager=callback_manager,
    163     **kwargs,
    164 )
...
    101 )
    102 self._llm = llm
    103 self._verbose = verbose

ValueError: Model name /var/model/mistral-7b-instruct-v0.2.Q5_K_M.gguf does not support function calling API.
```
In the first example, the model can call tools, so why can't we use the same strategy with function calling agent workers?