yildirimgoks opened this issue 1 day ago
@yildirimgoks, this is intentional. `ToolCall`
is an internal representation of an AI tool-call message. In the end, the OpenAI tool-calling API is invoked with the compatible schema. But if you want to achieve the desired result:
```python
from langchain_core.messages import AIMessage, ToolCall

tool_call = {
    "id": "tc1",
    "type": "function",
    "name": "some_function",
    "args": {"some_argument": "some_value"},
}

# The line below works fine
ToolCall(**tool_call)

# The line below raises an error
AIMessage(content="", tool_calls=[ToolCall(**tool_call)])
```
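A possible workaround (a sketch, not an official LangChain API; it assumes you control the payload) is to strip the OpenAI-specific keys before handing the dict to `AIMessage`, since the `tool_call()` factory only accepts `name`, `args`, and `id`:

```python
# OpenAI-style payload; the extra "type" key is what trips up the validator.
openai_tool_call = {
    "id": "tc1",
    "type": "function",
    "name": "some_function",
    "args": {"some_argument": "some_value"},
}

# Keep only the keys the tool_call() factory accepts.
ALLOWED_KEYS = {"id", "name", "args"}
filtered = {k: v for k, v in openai_tool_call.items() if k in ALLOWED_KEYS}

# filtered can now be passed as
# AIMessage(content="", tool_calls=[filtered])
print(filtered)
```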
Description
When you provide a value for the `tool_calls` argument of `AIMessage`, a root validator tries to parse each value into a `ToolCall` object using the `langchain_core.messages.tool.tool_call` function. Although instantiating `ToolCall` directly from an OpenAI-compatible tool-call dict is possible, the `tool_call()` function expects a specific set of keys to build the `ToolCall` and does not accept extra keyword arguments. This enforces a very specific data model whenever `tool_calls` is passed to `AIMessage` and makes it impossible to create an `AIMessage` object with the desired `tool_calls` formatting.
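The failure mode can be reproduced without LangChain. The `strict_tool_call` function below is a hypothetical stand-in that mirrors the factory's keyword-only signature; passing an OpenAI-style dict raises `TypeError` because of the extra `type` key:

```python
# Hypothetical stand-in for langchain_core.messages.tool.tool_call:
# keyword-only parameters and no **kwargs, so extra keys are rejected.
def strict_tool_call(*, name: str, args: dict, id: str = None):
    return {"name": name, "args": args, "id": id, "type": "tool_call"}

# OpenAI-compatible payload with the extra "type" key.
payload = {"id": "tc1", "type": "function", "name": "f", "args": {}}

try:
    strict_tool_call(**payload)
    raised = False
except TypeError:
    # "type" is not an accepted parameter, mirroring the reported error.
    raised = True
```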
Is this intended, or could this be a bug? Is there a different way of creating an AI message with `tool_calls` that I'm missing?
System Info
langchain==0.2.14
langchain-community==0.2.12
langchain-core==0.2.33
langchain-openai==0.1.22
langchain-text-splitters==0.2.2
I checked the latest source code and documentation, and the issue persists in the most recent version as well.
Originally posted by @yildirimgoks in https://github.com/langchain-ai/langchain/discussions/27919