Closed by @yeesian 3 weeks ago
@yeesian Thanks for the issue report! Do you have a snippet to reproduce?
Does the following work for you?
Installation:

```shell
!pip3 install --upgrade --quiet \
  langchain \
  langchain-google-vertexai \
  openinference-instrumentation-langchain \
  opentelemetry-sdk \
  requests
```
Reproducing the issue:

```python
from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

from langchain import agents
from langchain_google_vertexai import ChatVertexAI
from langchain_core.prompts import ChatPromptTemplate
from langchain.tools import tool

# Export spans to the console so instrumented calls are visible immediately.
tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
LangChainInstrumentor().instrument()

@tool
def get_exchange_rate(
    currency_from: str = "USD",
    currency_to: str = "EUR",
    currency_date: str = "latest",
):
    """Retrieves the exchange rate between two currencies on a specified date."""
    import requests

    response = requests.get(
        f"https://api.frankfurter.app/{currency_date}",
        params={"from": currency_from, "to": currency_to},
    )
    return response.json()

model = "gemini-1.5-pro-001"
llm = ChatVertexAI(  # Or any chat model that supports tool-calling: https://python.langchain.com/v0.1/docs/integrations/chat/
    model_name=model,
    project="your-gcp-project",  # https://cloud.google.com/resource-manager/docs/creating-managing-projects
    max_tokens=500,
    temperature=0.5,
)

tools = [get_exchange_rate]
prompt = ChatPromptTemplate.from_messages([
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])
agent = agents.create_tool_calling_agent(llm, tools, prompt)
agent_executor = agents.AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "What is the exchange rate from US dollars to Swedish currency today?"})
```
@yeesian Thank you for bringing this to our attention! This should be fixed in the latest release:
openinference-instrumentation-langchain>=0.1.19
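If it helps, picking up the fix should just be a package upgrade (a sketch, assuming a pip-based environment):

```shell
# Upgrade only the instrumentation package to a version containing the fix.
pip3 install --upgrade "openinference-instrumentation-langchain>=0.1.19"
```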
Thank you for the quick fix! I can confirm that it is working on my end with `>=0.1.19` too.
Awesome, thanks @yeesian
I am getting similar errors. Is there a compatibility issue with OpenAI inputs (vision models)?
@rhlarora84 Can you file us an issue with a snippet to reproduce?
**Describe the bug**

**Expected behavior**
No `ValueError` to be raised.

**Additional context**
I think `_parse_message_data` in https://github.com/Arize-ai/openinference/blob/21b83e9207e68d561a26e4a4b5a7bb373cc2892b/python/instrumentation/openinference-instrumentation-langchain/src/openinference/instrumentation/langchain/_tracer.py#L400 just needs to be updated to include `ToolMessage` (source)
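For illustration, the suggested fix amounts to handling the tool-message type in the role-dispatch logic instead of falling through to a `ValueError`. Here is a minimal standalone sketch of that pattern; the message classes below are stand-ins, not the actual `langchain_core.messages` types or the real `_parse_message_data` implementation:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for langchain_core.messages classes.
@dataclass
class HumanMessage:
    content: str

@dataclass
class AIMessage:
    content: str

@dataclass
class ToolMessage:
    content: str
    tool_call_id: str = ""

def parse_message_data(message):
    """Map a message object to role/content attributes for a span."""
    if isinstance(message, HumanMessage):
        return {"role": "user", "content": message.content}
    if isinstance(message, AIMessage):
        return {"role": "assistant", "content": message.content}
    if isinstance(message, ToolMessage):
        # Previously unhandled message type, which triggered the ValueError.
        return {"role": "tool", "content": message.content}
    raise ValueError(f"unsupported message type: {type(message)}")
```

The real fix touches the instrumentation's `_parse_message_data`, but the shape of the change is the same: add a branch for the tool-message case before the fallback error.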