Arize-ai / openinference

Auto-Instrumentation for AI Observability
https://arize-ai.github.io/openinference/
Apache License 2.0
146 stars 22 forks

[bug] [langchain] Cannot parse message of type: ToolMessage #519

Closed: yeesian closed this issue 3 weeks ago

yeesian commented 3 weeks ago

Describe the bug

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/openinference/instrumentation/langchain/_tracer.py", line 274, in wrapper
    yield from wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openinference/instrumentation/langchain/_tracer.py", line 419, in _parse_message_data
    raise ValueError(f"Cannot parse message of type: {message_class_name}")
ValueError: Cannot parse message of type: ToolMessage

Expected behavior

No ValueError to be raised.

Additional context

I think _parse_message_data in https://github.com/Arize-ai/openinference/blob/21b83e9207e68d561a26e4a4b5a7bb373cc2892b/python/instrumentation/openinference-instrumentation-langchain/src/openinference/instrumentation/langchain/_tracer.py#L400 just needs to be updated to include ToolMessage (source)
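A minimal sketch of the kind of change being suggested. The actual _parse_message_data in _tracer.py does more than map roles; the dictionary and function names below are assumptions for illustration only:

```python
# Hypothetical sketch: dispatch on the message class name, with ToolMessage
# added as a recognized case. Names here are illustrative, not the real
# implementation in openinference/instrumentation/langchain/_tracer.py.

ROLE_BY_MESSAGE_CLASS = {
    "HumanMessage": "user",
    "AIMessage": "assistant",
    "SystemMessage": "system",
    "FunctionMessage": "function",
    "ToolMessage": "tool",  # the missing case that triggered the ValueError
}

def parse_message_role(message_class_name: str) -> str:
    try:
        return ROLE_BY_MESSAGE_CLASS[message_class_name]
    except KeyError:
        # Mirrors the error seen in the traceback above.
        raise ValueError(f"Cannot parse message of type: {message_class_name}")
```

With ToolMessage in the table, agent runs that return tool results would no longer fall through to the ValueError branch.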

axiomofjoy commented 3 weeks ago

@yeesian Thanks for the issue report! Do you have a snippet to reproduce?

yeesian commented 3 weeks ago

Does the following work for you?


Installation:

!pip3 install --upgrade --quiet \
    langchain \
    langchain-google-vertexai \
    openinference-instrumentation-langchain \
    opentelemetry-sdk \
    requests

Reproducing the issue:

from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from langchain import agents
from langchain_google_vertexai import ChatVertexAI
from langchain_core.prompts import ChatPromptTemplate
from langchain.tools import tool

tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

LangChainInstrumentor().instrument()

@tool
def get_exchange_rate(
    currency_from: str = "USD",
    currency_to: str = "EUR",
    currency_date: str = "latest",
):
    """Retrieves the exchange rate between two currencies on a specified date."""
    import requests
    response = requests.get(
        f"https://api.frankfurter.app/{currency_date}",
        params={"from": currency_from, "to": currency_to},
    )
    return response.json()

model = "gemini-1.5-pro-001"
llm = ChatVertexAI( # Or a ChatModel that supports tool-calling in https://python.langchain.com/v0.1/docs/integrations/chat/
    model_name=model,
    project="your-gcp-project", # https://cloud.google.com/resource-manager/docs/creating-managing-projects
    max_tokens=500,
    temperature=0.5,
)

tools = [get_exchange_rate]

prompt = ChatPromptTemplate.from_messages([
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = agents.create_tool_calling_agent(llm, tools, prompt)
agent_executor = agents.AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "What is the exchange rate from US dollars to Swedish currency today?"})

RogerHYang commented 3 weeks ago

@yeesian Thank you for bringing this to our attention! This should be fixed in the latest release.

openinference-instrumentation-langchain>=0.1.19
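To confirm that the environment actually picked up the fixed release, one quick standard-library check (the package name comes from the line above; the printed fallback string is illustrative):

```python
# Print the installed version of the instrumentation package, if present.
from importlib.metadata import PackageNotFoundError, version

try:
    print(version("openinference-instrumentation-langchain"))
except PackageNotFoundError:
    print("openinference-instrumentation-langchain is not installed")
```

Any version at or above 0.1.19 should include the ToolMessage fix.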

yeesian commented 3 weeks ago

Thank you for the quick fix! I can confirm that it is working on my end with >=0.1.19 too.

axiomofjoy commented 3 weeks ago

Awesome, thanks @yeesian

rhlarora84 commented 6 days ago

I am getting a similar error. Is there a compatibility issue with OpenAI inputs (vision models)?

axiomofjoy commented 6 days ago

@rhlarora84 Can you file us an issue with a snippet to reproduce?