langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

langchain agents executor throws: assert generation is not None #22585

Open archine opened 1 month ago

archine commented 1 month ago


Example Code

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain.prompts import ChatPromptTemplate

@tool
def multiply(first_int: int, second_int: int) -> int:
    """Multiply two integers together."""
    return first_int * second_int

@tool
def add(first_int: int, second_int: int) -> int:
    """Add two integers."""
    return first_int + second_int

tools = [multiply, add]

if __name__ == '__main__':

    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful assistant"),
            ("placeholder", "{chat_history}"),
            ("human", "{input}"),
            ("placeholder", "{agent_scratchpad}"),
        ]
    )

    # `llm` was used below but never defined in the original snippet; its
    # construction was elided. Note that create_tool_calling_agent binds the
    # tools itself, so a separate llm.bind_tools(tools) call is not needed.
    llm = ChatOpenAI(...)

    calling_agent = create_tool_calling_agent(llm, tools, prompt)

    agent_executor = AgentExecutor(agent=calling_agent, tools=tools, verbose=True)

    response = agent_executor.invoke({
        "input": "what is the value of multiply(5, 42)?",
    })

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "E:\PycharmProjects\agent-tool-demo\main.py", line 61, in <module>
    stream = agent_executor.invoke({
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\chains\base.py", line 166, in invoke
    raise e
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\agents\agent.py", line 1433, in _call
    next_step_output = self._take_next_step(
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\agents\agent.py", line 1139, in _take_next_step
    [
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\agents\agent.py", line 1139, in <listcomp>
    [
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\agents\agent.py", line 1167, in _iter_next_step
    output = self.agent.plan(
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain\agents\agent.py", line 515, in plan
    for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 2775, in stream
    yield from self.transform(iter([input]), config, **kwargs)
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 2762, in transform
    yield from self._transform_stream_with_config(
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 1778, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 2726, in _transform
    for output in final_pipeline:
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 1154, in transform
    for ichunk in input:
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 4644, in transform
    yield from self.bound.transform(
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\runnables\base.py", line 1172, in transform
    yield from self.stream(final, config, **kwargs)
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\language_models\chat_models.py", line 265, in stream
    raise e
  File "E:\conda\env\agent-tool-demo\lib\site-packages\langchain_core\language_models\chat_models.py", line 257, in stream
    assert generation is not None
AssertionError

Description

The error above occurs when I call `agent_executor.invoke()` with the example code.
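For context: the assertion comes from `BaseChatModel.stream` in langchain_core, where `generation` starts as `None` and is only assigned once the model yields at least one chunk, so an endpoint that streams zero chunks trips the assert. A simplified sketch of that pattern (paraphrased, not the actual library code):

```python
# Simplified sketch of the pattern behind the failing assertion
# (paraphrased; not the actual langchain_core source). `generation`
# is only assigned when the model yields at least one chunk, so an
# empty stream from the endpoint raises AssertionError.
def accumulate_stream(chunks):
    generation = None
    for chunk in chunks:
        generation = chunk if generation is None else generation + chunk
    assert generation is not None  # fires when the stream was empty
    return generation

print(accumulate_stream(["Hello", ", world"]))  # → Hello, world
```

Calling `accumulate_stream([])` raises `AssertionError`, which is the symptom reported here: the model response never produced a chunk.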

System Info

langchain 0.2.1

lucas-tucker commented 1 month ago

This seems to be fixed in langchain 0.2.2

archine commented 1 month ago

> This seems to be fixed in langchain 0.2.2

No. I upgraded to 0.2.3 and, running the official example below, I still hit `assert generation is not None`:

from langchain.agents import create_tool_calling_agent,AgentExecutor
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

if __name__ == '__main__':
    llm = ChatOpenAI(...)

    @tool
    def magic_function(input: int) -> int:
        """Applies a magic function to an input."""
        return input + 2

    tools = [magic_function]

    query = "what is the value of magic_function(3)?"

    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful assistant"),
            ("human", "{input}"),
            # Placeholders fill up a **list** of messages
            ("placeholder", "{agent_scratchpad}"),
        ]
    )

    agent = create_tool_calling_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools)

    agent_executor.invoke({"input": query})

Relevant versions

langchain==0.2.3
langchain-community==0.2.3
langchain-core==0.2.3
langchain-openai==0.1.8

lucas-tucker commented 1 month ago

I cannot seem to replicate the error, even after upgrading to 0.2.3. How are you creating llm? Also could you please clarify what the "official example" is?
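One way to narrow this down while the details are clarified: check whether the model's stream yields any chunks at all, since the assertion fires exactly when it yields none. Below is a small, library-agnostic triage helper (hypothetical, not a LangChain API); with a real model you would pass it `llm.stream("hi")` instead of the demo list:

```python
from itertools import chain

# Hypothetical triage helper, not a LangChain API: peek at the first
# chunk of any stream without losing it, to distinguish "the endpoint
# returns an empty stream" (the assertion's trigger) from an agent-side bug.
def peek_stream(stream):
    it = iter(stream)
    try:
        first = next(it)
    except StopIteration:
        return None, iter(())          # empty stream: would trip the assert
    return first, chain([first], it)   # re-attach the peeked chunk

first, rest = peek_stream(["Hel", "lo"])
print(first, list(rest))  # → Hel ['Hel', 'lo']
```

If `peek_stream(llm.stream("hi"))` returns `None` as the first element, the problem is in the model endpoint or its configuration rather than in the agent executor.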

archine commented 1 month ago

> I cannot seem to replicate the error, even after upgrading to 0.2.3. How are you creating llm? Also could you please clarify what the "official example" is?

I re-edited the sample code in my previous reply; it is the demo from the library's docstring comments. [screenshot]