paschembri closed this issue 1 month ago
It may be the JsonOutputFunctionsParser that is executed and then provides a dict to the Memory backend (which then fails).
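For illustration, here's a small sketch (assuming the JsonOutputFunctionsParser from langchain at the time; the function name and arguments are made up) showing that the parser indeed returns a dict, which is what then reaches the memory backend:

from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
from langchain.schema import AIMessage, ChatGeneration

# Simulate a chat model response that carries an OpenAI function call.
generation = ChatGeneration(
    message=AIMessage(
        content="",
        additional_kwargs={
            "function_call": {"name": "my_fn", "arguments": '{"a": 1}'}
        },
    )
)

parser = JsonOutputFunctionsParser()
result = parser.parse_result([generation])
print(type(result), result)  # <class 'dict'> {'a': 1} -- a dict, not a str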
Answer generated by a 🤖
To fix the issue with ConversationBufferMemory failing to capture OpenAI functions messages in LLMChain, you need to modify the _convert_dict_to_message function in the langchain.chat_models.openai module to handle the case when the message content is a dictionary. You can do this by checking the message role and creating a FunctionMessage instance when the role is "function". Here's the modified _convert_dict_to_message function:
import json

from langchain.schema import (
    AIMessage,
    FunctionMessage,
    HumanMessage,
    SystemMessage,
)

def _convert_dict_to_message(message_dict: dict):
    role = message_dict.get("role")
    if role == "system":
        return SystemMessage(content=message_dict["content"])
    elif role == "user":
        return HumanMessage(content=message_dict["content"])
    elif role == "ai":
        return AIMessage(content=message_dict["content"])
    elif role == "function":
        content = message_dict.get("content")
        # Function results can arrive as a dict; serialize them so the
        # message content is always a string.
        if isinstance(content, dict):
            content = json.dumps(content)
        return FunctionMessage(name=message_dict.get("name"), content=content)
    else:
        raise ValueError(f"Unknown message role: {role}")
Now, when you create an LLMChain with both functions and memory, it should generate a FunctionMessage and accept a dictionary or JSON dumps as expected (Source: tests/unit_tests/chat_models/test_openai.py).
This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
I'm having the same issue working with an Agent with conversational memory and open source models.
Same issue here. I can't chain both a structured_output_chain (OpenAI functions) and another LLMChain with memory in a SequentialChain, because it fails in the same way as explained here:
ValidationError: 1 validation error for AIMessage
content
str type expected (type=type_error.str)
Encountered a similar issue:
self.messages.append(AIMessage(content=message))
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for AIMessage
content
str type expected (type=type_error.str)
Here's a "solution" (more like a workaround):
Open chat_memory.py
in C:\Users\YOURNAME\Programs\Python\Python311\Lib\site-packages\langchain\memory\chat_memory.py
and replace this part:
class ChatMessageHistory(BaseModel):
    messages: List[BaseMessage] = Field(default_factory=list)

    def add_user_message(self, message: str) -> None:
        self.messages.append(HumanMessage(content=message))

    def add_ai_message(self, message: str) -> None:
        self.messages.append(AIMessage(content=message))

    def clear(self) -> None:
        self.messages = []
with:
class ChatMessageHistory(BaseModel):
    messages: List[BaseMessage] = Field(default_factory=list)

    def add_user_message(self, message: str) -> None:
        if not isinstance(message, str):
            message = str(message)
        self.messages.append(HumanMessage(content=message))

    def add_ai_message(self, message: str) -> None:
        if not isinstance(message, str):
            message = str(message)
        self.messages.append(AIMessage(content=message))

    def clear(self) -> None:
        self.messages = []
@abdulrahimpds But this workaround will fail when you reinstall the package, correct? The issue appears when I try to pass an extra variable: results = self.agent_chain.run(input={"message": message, "phone_number": phone_number})
@ferasawadi The workaround essentially converts any non-string message to a string before adding it to the messages list, which addresses the ValidationError. This alteration in the add_ai_message method ensures that the message is always a string, thus preventing the error. You're correct that reinstalling the package would overwrite this change; the modification is only a temporary fix. (Devs should notice this issue and get it fixed ASAP.)
In your case, when passing an extra variable (phone_number), you should ensure that the input parameter is a string to avoid similar issues. You can do so by converting the input parameter to a string as shown:
results = self.agent_chain.run(input=str({
    "message": message,
    "phone_number": phone_number,
}))
This will convert the entire input dictionary to a string, ensuring that the message is a string as expected by the add_ai_message method.
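As a side note (my suggestion, not something from the thread): json.dumps may be a safer way to stringify the dict than str(), since it produces valid JSON that can be parsed back later:

import json

results = self.agent_chain.run(input=json.dumps({
    "message": message,
    "phone_number": phone_number,
}))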
@ferasawadi For a permanent solution, developers should modify the ChatMessageHistory class to ensure both input and output messages are always converted to strings, preventing the ValidationError. The revised add_user_message and add_ai_message methods should check the message type and convert it to a string if it's not already, as shown below:
class ChatMessageHistory(BaseModel):
    messages: List[BaseMessage] = Field(default_factory=list)

    def add_user_message(self, message: str) -> None:
        if not isinstance(message, str):
            message = str(message)
        self.messages.append(HumanMessage(content=message))

    def add_ai_message(self, message: str) -> None:
        if not isinstance(message, str):
            message = str(message)
        self.messages.append(AIMessage(content=message))

    def clear(self) -> None:
        self.messages = []
This adjustment will handle the ValidationError by ensuring that messages are always strings, negating the need for manual conversion and addressing the issue at its root.
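A quick way to sanity-check the patched class (assuming the modified ChatMessageHistory above is in place; the dict payload is illustrative):

history = ChatMessageHistory()
history.add_ai_message({"name": "get_weather", "arguments": {"city": "Paris"}})
# Without the patch this raises "str type expected"; with it, content is a str:
print(type(history.messages[0].content))  # <class 'str'>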
Hi, @paschembri,
I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue involves the failure of ConversationBufferMemory to capture OpenAI function messages in LLMChain due to the generation of AIMessage instead of FunctionMessage. The issue has been tested on version 0.0.215 and is seeking assistance from specific individuals. Several users have reported encountering the same issue and have provided workarounds and potential solutions, including modifying the _convert_dict_to_message function and the ChatMessageHistory class.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days.
Thank you for your understanding and cooperation.
Getting this error in 0.1.0 as well.
Changing add_ai_message in langchain_core/chat_history.py to
def add_ai_message(self, message: Union[AIMessage, str]) -> None:
    """Convenience method for adding an AI message string to the store.

    Args:
        message: The AI message to add.
    """
    if isinstance(message, AIMessage):
        self.add_message(message)
    elif isinstance(message, str):
        self.add_message(AIMessage(content=message))
    elif isinstance(message, dict):
        import json

        self.add_message(AIMessage(content=json.dumps(message)))
seems to fix the issue.
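If you'd rather not edit the installed package at all, one alternative (a sketch, not an official fix) is to monkey-patch the method at runtime so upgrades don't silently revert the change:

import json

from langchain_core.chat_history import BaseChatMessageHistory

_original_add_ai_message = BaseChatMessageHistory.add_ai_message

def _safe_add_ai_message(self, message):
    # Coerce dict payloads (e.g. parsed OpenAI function calls) to a JSON
    # string before they reach AIMessage, which requires str content.
    if isinstance(message, dict):
        message = json.dumps(message)
    _original_add_ai_message(self, message)

BaseChatMessageHistory.add_ai_message = _safe_add_ai_message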
I'm experiencing this issue with version langchain==0.1.13 too. How can I fix that? Thank you
This still exists in langchain 0.1.13, as stated above.
The issue comes all the way from BaseChain and BaseChatMemory.
The type hints need to be fixed in both of those classes: they expect output_str to be a str, but output_str comes from a dictionary and can itself be a compound dictionary (as is the case here), meaning output_str is currently a dict.
Instead of editing source code that will be broken by updates, here is a temporary fix using an inheritance method override:
from langchain.memory import ConversationBufferMemory
from langchain_core.messages import AIMessage, HumanMessage

class PatchedConversationBufferMemory(ConversationBufferMemory):
    def save_context(self, inputs, outputs) -> None:
        """Save context from this conversation to buffer."""
        input_str, output_str = self._get_input_output(inputs, outputs)
        # this line is the temp-fix: coerce non-string outputs to str
        output_str = output_str if isinstance(output_str, str) else str(output_str)
        self.chat_memory.add_messages(
            [HumanMessage(content=input_str), AIMessage(content=output_str)]
        )
# so now, instead of creating memory from ConversationBufferMemory, you just use PatchedConversationBufferMemory
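Usage would look like the following (a sketch; llm and prompt stand in for whatever the original chain used):

from langchain.chains import LLMChain

memory = PatchedConversationBufferMemory()
chain = LLMChain(llm=llm, prompt=prompt, memory=memory)  # llm/prompt as before
# chain.run(...) no longer raises the AIMessage ValidationError on dict outputs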
PR: 19769 (rejected 🤷)
System Info
Adding memory to an LLMChain with OpenAI functions enabled fails because AIMessage is generated instead of FunctionMessage, where AIMessage.content is in fact a dict.
Tested version: 0.0.215
Who can help?
@dev2049 @hwchase17
Information
Related Components
Reproduction
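The original report doesn't include a runnable snippet here, but a minimal sketch along these lines should trigger the failure (assumes the 0.0.215-era API; the function schema is illustrative):

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate

functions = [{
    "name": "record_person",
    "description": "Record a person's details",
    "parameters": {
        "type": "object",
        "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
        "required": ["name"],
    },
}]

chain = LLMChain(
    llm=ChatOpenAI(model="gpt-3.5-turbo-0613"),
    prompt=ChatPromptTemplate.from_messages(
        [HumanMessagePromptTemplate.from_template("{input}")]
    ),
    llm_kwargs={"functions": functions},
    output_parser=JsonOutputFunctionsParser(),
    memory=ConversationBufferMemory(),
)

# The parser returns a dict, which memory then tries to store as AIMessage.content:
chain.run(input="Alice is 30")  # raises ValidationError: str type expected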
Expected behavior
When LLMChain is created with functions and memory, generate FunctionMessage and accept a dict (or JSON dumps).