Closed: t-ega closed this issue 1 year ago
🤖
Hello,
Thank you for bringing this issue to our attention. It seems like the problem you're experiencing is related to memory handling in the LangChain framework. Specifically, the run method in the ConversationChain class is not implemented to save the context and return the memory, which is why the model doesn't recognize previously mentioned information such as a user's name.
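For reference, here is a minimal sketch (not taken from the original issue; the memory_key and example strings are illustrative) of what saving and reloading context looks like when a ConversationBufferMemory is used directly:

from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
# save_context stores one exchange; the keys must match the chain's input/output keys
memory.save_context({"input": "My name is Dev"}, {"output": "Nice to meet you, Dev!"})
# load_memory_variables returns the stored history under memory_key
print(memory.load_memory_variables({}))

With return_messages=True, load_memory_variables returns the stored exchange as message objects under the chat_history key.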
Here are a few similar issues that have been solved in the past:
Initialize_agent not storing messages when memory is present: The issue was related to how the save_context method in the ConversationBufferMemory class was functioning. The solution was to ensure that the keys used in self._get_input_output(inputs, outputs) correctly correspond to the keys used to store the input and output strings in the inputs and outputs dictionaries. Additionally, the add_user_message and add_ai_message methods needed to correctly add the messages to the agent's memory (a short sketch of these methods follows the examples below).
Does the new Structured Chat Agent support ConversationMemory?: The solution was to add chat history to the STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION agent with the following code:
from langchain.agents import AgentType, initialize_agent
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder

chat_history = MessagesPlaceholder(variable_name="chat_history")
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
# tools and llm are assumed to be defined as in the original issue
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
    agent_kwargs={
        "memory_prompts": [chat_history],
        "input_variables": ["input", "agent_scratchpad", "chat_history"],
    },
    memory=memory,
)
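As a companion to the first issue above, this sketch (illustrative, not taken from the linked issue) shows the add_user_message and add_ai_message path, which writes straight to the memory's underlying chat history:

from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
# add_user_message / add_ai_message append directly to the underlying ChatMessageHistory
memory.chat_memory.add_user_message("My name is Dev")
memory.chat_memory.add_ai_message("Nice to meet you, Dev!")
# with return_messages=True, both messages come back under the "chat_history" key
print(memory.load_memory_variables({}))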
Based on these solutions, I would recommend checking the keys used in your inputs and outputs dictionaries and ensuring that the add_user_message and add_ai_message methods are correctly adding the messages to the agent's memory. Additionally, you might want to add more clarity to your prompt to distinguish between new input and chat history.
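A quick way to verify this, reusing the agent and memory objects from the snippet above, is to run the agent once and print what was actually stored:

agent.run(input="My name is Dev")
# if the exchange was recorded, both a HumanMessage and an AIMessage should appear here
for message in memory.chat_memory.messages:
    print(type(message).__name__, ":", message.content)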
I hope this helps! If you have any other questions or need further clarification, please don't hesitate to ask.
Best, Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
System Info
langchain version == 0.0.271
Who can help?
@hwchase17
Reproduction
from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder

chat_history = MessagesPlaceholder(variable_name="chat_history")
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
llm = ChatOpenAI(temperature=0, model_name='gpt-3.5-turbo')
# tools is defined elsewhere in the original report
agent_chain = initialize_agent(
    tools=tools,
    llm=llm,
    agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
    memory=memory,
    max_iterations=1,
    early_stopping_method='generate',
    handle_parsing_errors=True,
    system_prompt="You are an assistant named Mikey",
    agent_kwargs={
        "memory_prompts": [chat_history],
        "input_variables": ["input", "agent_scratchpad", "chat_history"],
    },
)
agent_chain.run(input="My name is Dev")
agent_chain.run(input="What is my name")
Expected behavior
Expected final answer: Your name is Dev
Actual final answer: I don't have access to personal information like your name. Is there anything else I can help you with?
Note: the memory is not being passed along to the model, so it doesn't know my name.
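One way to narrow this down (a diagnostic sketch, reusing agent_chain and memory from the reproduction above) is to inspect the memory between the two calls: if the first exchange shows up but the agent still cannot answer, the history is being saved but not injected into the prompt; if nothing shows up, the exchange never reaches the memory at all.

agent_chain.run(input="My name is Dev")
# check what, if anything, was stored after the first call
print(memory.load_memory_variables({}))
agent_chain.run(input="What is my name")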