langchain-ai / langchain-nvidia


add chat tool calling (invoke only) #72

Closed: mattf closed this 1 month ago

mattf commented 1 month ago
from langchain_core.pydantic_v1 import Field
from langchain_core.tools import tool
from langchain_nvidia_ai_endpoints import ChatNVIDIA

@tool
def xxyyzz(
    a: int = Field(..., description="First number"),
    b: int = Field(..., description="Second number"),
) -> int:
    """xxyyzz two numbers"""
    return (a**b) % (b - a)

# tool_model and mode come from the surrounding test/demo setup (not shown here)
llm = ChatNVIDIA(model=tool_model, **mode).bind_tools(tools=[xxyyzz])
response = llm.invoke("What is 11 xxyyzz 3?", tool_choice="required")
print(response.tool_calls)
[{'name': 'xxyyzz', 'args': {'a': 11, 'b': 3}, 'id': 'chatcmpl-tool-f80cd197c04e41c6be7a86bb20434e01'}]
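
A follow-up sketch, not part of the PR example: the returned tool call can be executed locally by passing its args back into the @tool-decorated function, which is a Runnable and supports invoke.

# sketch only: execute the single tool call returned above
tool_call = response.tool_calls[0]
result = xxyyzz.invoke(tool_call["args"])  # 11**3 % (3 - 11) == -5 in Python
print(result)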

Note: invoke now returns an AIMessage when a tool call is present and a ChatMessage otherwise. The old behavior was to always return a ChatMessage. The new behavior is not technically API breaking, because invoke declares that it returns a BaseMessage and both AIMessage and ChatMessage are subclasses of BaseMessage. However, AIMessage and ChatMessage are siblings and not interchangeable: ChatMessage has a role property that is present on neither BaseMessage nor AIMessage. Code that expects role on the response will break with this change.
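
For illustration only, a hedged sketch of how a caller might guard against this difference (the isinstance checks and getattr fallback are assumptions, not part of this PR):

from langchain_core.messages import AIMessage, ChatMessage

msg = llm.invoke("What is 11 xxyyzz 3?")
if isinstance(msg, AIMessage):
    # tool-call responses no longer carry a role attribute
    print(msg.tool_calls, getattr(msg, "role", None))
elif isinstance(msg, ChatMessage):
    print(msg.role)  # plain responses still expose role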

raspawar commented 1 month ago

Reviewed code impl, looks great!

mattf commented 1 month ago

superseded by https://github.com/langchain-ai/langchain-nvidia/pull/74