langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
95.21k stars · 15.44k forks

Given example fails #27895

Open twobitunicorn opened 2 weeks ago

twobitunicorn commented 2 weeks ago

Checked other resources

Example Code

import pandas as pd
from langchain_openai import ChatOpenAI
from langchain_core.output_parsers import JsonOutputParser
from langchain_experimental.agents.agent_toolkits import create_pandas_dataframe_agent

# Step 1: Load your DataFrame
df = pd.read_csv("https://raw.githubusercontent.com/pandas-dev/pandas/main/doc/data/titanic.csv")

# Step 2: Initialize the LLM
model = ChatOpenAI(temperature=0)

# Step 3: Define the JsonOutputParser
parser = JsonOutputParser()

# Step 4: Create a Pandas DataFrame Agent
agent = create_pandas_dataframe_agent(model, df, verbose=True)

# Step 5: Chain the components
chain = agent | parser

# Step 6: Invoke the chain with a query
result = chain.invoke({"query": "How many people survived?"})

# Output the result
print(result)

Error Message and Stack Trace (if applicable)

new AgentExecutor chain...
Thought: To find out how many people survived, I need to sum the 'Survived' column in the dataframe.
Action: python_repl_ast
Action Input: df['Survived'].sum()
342
I now know the final answer
Final Answer: 342 people survived.

Finished chain.

Traceback (most recent call last):
  File "/Users/joseph/Projects/chat_bot/scratch.py", line 29, in <module>
    result = chain.invoke("How many people survived?")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3024, in invoke
    input = context.run(step.invoke, input, config)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/langchain_core/output_parsers/base.py", line 202, in invoke
    return self._call_with_config(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1927, in _call_with_config
    context.run(
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 396, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/langchain_core/output_parsers/base.py", line 203, in <lambda>
    lambda inner_input: self.parse_result([Generation(text=inner_input)]),
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/langchain_core/load/serializable.py", line 125, in __init__
    super().__init__(*args, **kwargs)
  File "/Users/joseph/Projects/chat_bot/.venv/lib/python3.12/site-packages/pydantic/main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for Generation
text
  Input should be a valid string [type=string_type, input_value={'input': 'How many peopl... '342 people survived.'}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.9/v/string_type

Description

I was getting this failure in my own code, so I went to the documentation site and asked the chat assistant "how can I chain an Agent with a JsonOutputParser" — the example above is the code it gave me, and it fails as shown.
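The root cause appears to be a type mismatch: `AgentExecutor.invoke()` returns a dict with `input` and `output` keys, while `JsonOutputParser` expects a plain string, so piping one straight into the other hands the parser a dict. A minimal, dependency-free sketch of the mismatch, with hypothetical values standing in for the real agent output:

```python
# Sketch of the type mismatch (hypothetical values, no LangChain needed).
# AgentExecutor.invoke() returns a dict shaped roughly like this,
# not a bare string:
agent_result = {
    "input": "How many people survived?",
    "output": "342 people survived.",
}

# JsonOutputParser expects raw text, so the fix is to pull out the
# "output" field before parsing:
output_text = (
    agent_result["output"]
    if isinstance(agent_result, dict) and "output" in agent_result
    else agent_result
)
print(output_text)  # the string the parser should actually receive
```

This matches the `input_value={'input': ..., 'output': ...}` dict that pydantic reports in the validation error.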

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.1.0: Thu Oct 10 21:03:11 PDT 2024; root:xnu-11215.41.3~2/RELEASE_ARM64_T6020
Python Version: 3.12.7 (main, Oct 8 2024, 17:22:48) [Clang 16.0.0 (clang-1600.0.26.3)]

Package Information

langchain_core: 0.3.15
langchain: 0.3.7
langchain_community: 0.3.4
langsmith: 0.1.139
langchain_experimental: 0.3.2
langchain_huggingface: 0.1.2
langchain_ollama: 0.2.0
langchain_openai: 0.2.5
langchain_text_splitters: 0.3.2

Optional packages not installed

langgraph
langserve

Other Dependencies

aiohttp: 3.10.10
async-timeout: Installed. No version info available.
dataclasses-json: 0.6.7
httpx: 0.27.2
httpx-sse: 0.4.0
huggingface-hub: 0.26.2
jsonpatch: 1.33
numpy: 1.26.4
ollama: 0.3.3
openai: 1.53.0
orjson: 3.10.10
packaging: 24.1
pydantic: 2.9.2
pydantic-settings: 2.6.1
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
sentence-transformers: 3.2.1
SQLAlchemy: 2.0.36
tenacity: 9.0.0
tiktoken: 0.8.0
tokenizers: 0.20.1
transformers: 4.46.1
typing-extensions: 4.12.2

twobitunicorn commented 2 weeks ago

The following code does work:

import pandas as pd
from langchain_openai import ChatOpenAI
from langchain_core.output_parsers import JsonOutputParser
from langchain_experimental.agents.agent_toolkits import create_pandas_dataframe_agent

# Step 1: Load your DataFrame
df = pd.read_csv("https://raw.githubusercontent.com/pandas-dev/pandas/main/doc/data/titanic.csv")

# Step 2: Initialize the LLM
model = ChatOpenAI(temperature=0)

# Step 3: Create a Pandas DataFrame Agent
agent = create_pandas_dataframe_agent(model, df, verbose=True)

# Step 4: Define the JsonOutputParser
parser = JsonOutputParser()

# Step 5: Invoke the agent with a query
query = "How many people survived?"
result = agent.invoke(query)

# Step 6: Ensure the result is a string before parsing
if isinstance(result, dict) and 'output' in result:
    output_text = result['output']
else:
    output_text = result

# Step 7: Parse the output
parsed_result = parser.parse(output_text)

# Output the result
print(parsed_result)
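The manual extract-then-parse steps above can also be folded back into a single pipe-style helper. A dependency-free sketch, with a fake agent standing in for the real one and `json.loads` standing in for `JsonOutputParser.parse` (assumption: the model has been prompted to answer in JSON):

```python
import json

def fake_agent(query: str) -> dict:
    # Stand-in for agent.invoke(); the real AgentExecutor returns a
    # dict whose "output" field holds the final answer text.
    return {"input": query, "output": '{"survived": 342}'}

def extract_output(result):
    # Mirrors Step 6 above: accept either a dict or a bare string.
    return result["output"] if isinstance(result, dict) else result

def chain(query: str):
    # Pipe-style composition of the three manual steps:
    # agent -> extract "output" -> parse JSON.
    return json.loads(extract_output(fake_agent(query)))

result = chain("How many people survived?")
```

Note that plain-English output like "342 people survived." is not valid JSON, so for the real parser to succeed the agent must be instructed to return a JSON answer.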

eyurtsev commented 1 week ago

Hi @twobitunicorn where did you find the example?

  1. We generally do not recommend using code from langchain_experimental without proper sandboxing.
  2. If you want to implement a pandas dataframe agent, I'd suggest creating something custom using langgraph.
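For the custom route, one pattern consistent with the sandboxing concern above (a sketch of the idea, not langgraph-specific code) is to expose narrow, read-only functions over the DataFrame and register those as the agent's tools, rather than handing the model an open Python REPL:

```python
import pandas as pd

# Tiny DataFrame standing in for the Titanic CSV (assumption: a
# "Survived" column of 0/1 flags, as in the real dataset).
df = pd.DataFrame({"Survived": [1, 0, 1, 1, 0]})

def survivor_count() -> int:
    """Narrow, read-only tool: total number of survivors.

    Exposing small functions like this to an agent (e.g. via
    langgraph's prebuilt ReAct agent) avoids giving the model
    arbitrary code execution over the process.
    """
    return int(df["Survived"].sum())

count = survivor_count()
```

The trade-off is flexibility: each question type needs its own tool, but the attack surface shrinks to exactly the functions you chose to expose.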