langchain-ai / langchain-nvidia

MIT License

add chat tool calling #74

Closed mattf closed 2 months ago

mattf commented 3 months ago
from langchain_core.pydantic_v1 import Field
from langchain_core.tools import tool
from langchain_nvidia_ai_endpoints import ChatNVIDIA

@tool
def xxyyzz(
    a: int = Field(..., description="First number"),
    b: int = Field(..., description="Second number"),
) -> int:
    """xxyyzz two numbers"""
    return (a**b) % (b - a)

# tool_model and mode are assumed to be defined earlier: a model name that
# supports tool calling, and any extra ChatNVIDIA constructor kwargs.
llm = ChatNVIDIA(model=tool_model, **mode).bind_tools(tools=[xxyyzz])

invoke -

response = llm.invoke("What is 11 xxyyzz 3?", tool_choice="required")
print(response.tool_calls)

streaming -

from functools import reduce
from operator import add

response = reduce(add, llm.stream("What is 11 xxyyzz 3?", tool_choice="required"))
print(response.tool_calls)
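The reduce(add, ...) pattern works because message chunks define __add__ so that consecutive chunks merge their fields. A minimal sketch of that accumulation with a stand-in chunk class (not the real langchain_core types):

```python
from functools import reduce
from operator import add


class FakeChunk:
    """Stand-in for a streamed message chunk (illustrative only)."""

    def __init__(self, content, tool_calls=None):
        self.content = content
        self.tool_calls = tool_calls or []

    def __add__(self, other):
        # Concatenate text and combine tool-call fragments, the way chunk
        # addition merges streamed responses.
        return FakeChunk(self.content + other.content,
                         self.tool_calls + other.tool_calls)


stream = [FakeChunk("What"), FakeChunk(" is"), FakeChunk("", [{"name": "xxyyzz"}])]
merged = reduce(add, stream)
print(merged.content)     # "What is"
print(merged.tool_calls)  # [{'name': 'xxyyzz'}]
```

The same fold over real chunks yields a single message whose tool_calls field is fully assembled, which is why response.tool_calls is usable after the reduce.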

Note: stream now returns an AIMessageChunk; the old behavior was to return a ChatMessageChunk. This is not technically an API break, because stream declares a return type of BaseMessageChunk and both AIMessageChunk and ChatMessageChunk are subclasses of BaseMessageChunk. However, the two are siblings, not interchangeable: ChatMessageChunk has a role property that exists on neither BaseMessageChunk nor AIMessageChunk, so any user who expected role in the streamed response will be broken by this change.
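The incompatibility can be illustrated with stand-in classes that mirror the hierarchy described above (illustrative only, not the real langchain_core.messages types):

```python
class BaseMessageChunk:
    """Stand-in for the declared return type of stream."""

    def __init__(self, content):
        self.content = content


class AIMessageChunk(BaseMessageChunk):
    pass  # no role attribute


class ChatMessageChunk(BaseMessageChunk):
    def __init__(self, content, role):
        super().__init__(content)
        self.role = role  # only this sibling carries role


old_chunk = ChatMessageChunk("hi", role="assistant")
new_chunk = AIMessageChunk("hi")

print(old_chunk.role)               # worked before the change
print(hasattr(new_chunk, "role"))   # False: code reading .role now breaks
```

Both classes satisfy the BaseMessageChunk contract, so type checkers stay quiet; only callers that reach for the role attribute fail at runtime.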