Closed: 100sunnygirl closed this issue 9 months ago
🤖
Based on the error message you're encountering, it seems that `OpenAIFunctionsAgentOutputParser()` is expecting an instance of `AIMessage`, but it's receiving an instance of `ChatMessage` instead. This is likely due to changes in the LangChain framework in version 0.0.350.
The `_parse_ai_message()` method of the `OpenAIFunctionsAgentOutputParser` class is where this error is raised. Here's the relevant code:
```python
def _parse_ai_message(message: BaseMessage) -> Union[AgentAction, AgentFinish]:
    """Parse an AI message."""
    if not isinstance(message, AIMessage):
        raise TypeError(f"Expected an AI message got {type(message)}")
```
In previous versions of LangChain, the message passed to the `_parse_ai_message()` method might have been an instance of `AIMessage`. However, after the update, it seems to be an instance of `ChatMessage`.
To resolve this issue, you need to ensure that the message passed to the `_parse_ai_message()` method is an instance of `AIMessage`. If you're working with a `ChatMessage`, you might need to convert it to an `AIMessage` before passing it on, as in the sketch below. If this is not possible, you might need to modify the `OpenAIFunctionsAgentOutputParser` class to handle `ChatMessage` instances.
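If it helps, here is a minimal sketch of that conversion, assuming the message classes exported from `langchain.schema`; the helper name `to_ai_message` is just illustrative:

```python
from langchain.schema import AIMessage, BaseMessage, ChatMessage


def to_ai_message(message: BaseMessage) -> BaseMessage:
    """Re-wrap a ChatMessage as an AIMessage so the parser's isinstance check passes."""
    if isinstance(message, ChatMessage):
        return AIMessage(
            content=message.content,
            # A function_call, if present, lives in additional_kwargs, so carry it over.
            additional_kwargs=message.additional_kwargs,
        )
    return message
```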
Unfortunately, I couldn't find specific changes in the LangChain framework in version 0.0.350 that could have affected the behavior of `OpenAIFunctionsAgentOutputParser()`. It would be helpful if you could provide more information about how you're using the `OpenAIFunctionsAgentOutputParser()` and the `ChatMessage` class in your code. This could help identify the exact cause of the issue and find a suitable solution.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
🤖
Hello @qhNana7mi! I'm Dosu, a friendly bot here to assist you with bugs, answer your questions, and guide you in contributing to LangChain. While you're waiting for a human maintainer, feel free to ask me anything. Looking forward to helping you!
Based on the information provided, it seems like the error you're encountering is due to a change in the type of message returned by the `ChatGeneration` output in LangChain version 0.0.350. The `OpenAIFunctionsAgentOutputParser()._parse_ai_message()` method expects an instance of `AIMessage`, but it received an instance of `ChatMessage`. This could be due to the update, where the type of message returned might have changed.
Here is the relevant code causing the error: it is the same `isinstance` check in `_parse_ai_message()` quoted above.
(Source: libs/langchain/langchain/agents/output_parsers/openai_functions.py)
To resolve this issue, you might need to update the `_parse_ai_message()` method to accept `ChatMessage` instances (for example, by subclassing the parser as sketched below), or ensure that the message returned by the `ChatGeneration` output is an `AIMessage` instance.
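As a rough sketch of the first option, not a drop-in fix (the subclass name `ChatMessageTolerantParser` is made up here, and the `parse_result` signature may differ between releases):

```python
from typing import List, Union

from langchain.agents.output_parsers.openai_functions import (
    OpenAIFunctionsAgentOutputParser,
)
from langchain.schema import AgentAction, AgentFinish, AIMessage, ChatMessage
from langchain.schema.output import ChatGeneration, Generation


class ChatMessageTolerantParser(OpenAIFunctionsAgentOutputParser):
    """Converts a ChatMessage into an AIMessage before the stock parser sees it."""

    def parse_result(
        self, result: List[Generation], *, partial: bool = False
    ) -> Union[AgentAction, AgentFinish]:
        if result and isinstance(result[0], ChatGeneration):
            message = result[0].message
            if isinstance(message, ChatMessage):
                # Re-wrap the message so the isinstance(message, AIMessage) check passes.
                result[0].message = AIMessage(
                    content=message.content,
                    additional_kwargs=message.additional_kwargs,
                )
        return super().parse_result(result, partial=partial)
```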
There are similar issues that have been resolved in the LangChain repository, such as "The agent run output parser cause error when run a simple quick start". The suggested solution in these issues is to refer to the article on how to handle parsing errors in LangChain. The link to the article is: https://python.langchain.com/docs/modules/agents/how_to/handle_parsing_errors
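From that guide, wiring the option into the executor looks roughly like this (a sketch only; `agent` and `tools` stand in for whatever you already construct, and note that `handle_parsing_errors` targets output-parsing exceptions rather than the `TypeError` above):

```python
from langchain.agents import AgentExecutor


def build_executor(agent, tools):
    """Attach a parsing-error handler to the executor, per the linked guide (sketch only)."""
    return AgentExecutor(
        agent=agent,
        tools=tools,
        handle_parsing_errors=True,  # a custom string or callable also works
        verbose=True,
    )
```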
Please let me know if this helps or if you need further assistance.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.