from langchain_core.pydantic_v1 import Field
from langchain_core.tools import tool
from langchain_nvidia_ai_endpoints import ChatNVIDIA
@tool
def xxyyzz(
    a: int = Field(..., description="First number"),
    b: int = Field(..., description="Second number"),
) -> int:
    """xxyyzz two numbers"""
    return (a**b) % (b - a)
# `tool_model` and `mode` are assumed to be defined earlier in the notebook
# (the model name and any endpoint keyword arguments).
llm = ChatNVIDIA(model=tool_model, **mode).bind_tools(tools=[xxyyzz])
Invoke:
response = llm.invoke("What is 11 xxyyzz 3?", tool_choice="required")
print(response.tool_calls)
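For reference, here is a quick pure-Python check of what the tool itself computes for the example prompt, assuming the model emits a tool call with args {"a": 11, "b": 3} (the exact call the model produces may vary):

```python
# Same arithmetic as the decorated xxyyzz tool above, without the
# LangChain wrapper, so the expected tool result can be verified locally.
def xxyyzz(a: int, b: int) -> int:
    """xxyyzz two numbers."""
    return (a**b) % (b - a)

# 11**3 == 1331; 1331 % (3 - 11) uses Python's floored modulo,
# so the result takes the sign of the divisor.
print(xxyyzz(11, 3))  # -5
```

Note that the model does not run the tool itself; response.tool_calls only carries the name and arguments, and it is up to the caller to execute the tool and feed the result back.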
Streaming:
from functools import reduce
from operator import add
response = reduce(add, llm.stream("What is 11 xxyyzz 3?", tool_choice="required"))
print(response.tool_calls)
Note: stream now returns an AIMessageChunk. The old behavior was to return a ChatMessageChunk. This new behavior is not technically API-breaking, because stream declares that it returns a BaseMessageChunk, and both AIMessageChunk and ChatMessageChunk are children of BaseMessageChunk. However, AIMessageChunk and ChatMessageChunk are siblings and are not compatible: ChatMessageChunk has a role property that is present on neither BaseMessageChunk nor AIMessageChunk. A user who expected role in the response will be broken by this change.
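The class relationship described in the note can be illustrated with a toy hierarchy (these are stand-in classes for illustration, not the real langchain_core message types):

```python
# Toy sketch of the hierarchy: AIMessageChunk and ChatMessageChunk are
# sibling subclasses of BaseMessageChunk, and only ChatMessageChunk
# carries a `role` attribute.
class BaseMessageChunk:
    def __init__(self, content: str):
        self.content = content

class AIMessageChunk(BaseMessageChunk):
    pass

class ChatMessageChunk(BaseMessageChunk):
    def __init__(self, content: str, role: str):
        super().__init__(content)
        self.role = role

old = ChatMessageChunk("hi", role="assistant")  # old stream() return type
new = AIMessageChunk("hi")                      # new stream() return type

# Both satisfy the declared BaseMessageChunk return type, but code that
# read `chunk.role` breaks with the new type.
print(hasattr(old, "role"), hasattr(new, "role"))  # True False
```

Code that only used content (or tool_calls, on the real class) is unaffected; only code reaching for role needs updating.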