Closed tomiezhang closed 10 months ago
Hello @tomiezhang 👋,
Nice to meet you! I'm Dosu, a friendly bot here to assist you with your programming woes, answer your queries, and guide you to become a contributor to this repository. While we wait for a human maintainer, feel free to ask me any questions. I'm available round the clock to make your experience smoother!
Based on the code you've shared, it seems like you're trying to use `ConversationBufferMemory` to store the conversation history. However, it's not clear from your message what exactly isn't working with the memory. Are you getting any error messages? Or is the memory not storing the conversation history as expected?
From the context provided, there are a few similar issues that have been solved in the LangChain repository. One of them is Agent does not recognize chat history (Missing some input keys: {'chat_history'}). The user in that issue resolved it by using the `ConversationalChatAgent.create_prompt` method instead of the `ZeroShotAgent.create_prompt` method. Here is the code snippet they used:
```python
prompt = ConversationalChatAgent.create_prompt(
    tools,
    system_message=prefix + format_instructions
)
llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo")
memory = ConversationBufferMemory(memory_key='chat_history')
agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent_type=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    prompt=prompt,
    memory=memory,
    handle_parsing_errors=True,
    max_iterations=5
)
```
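For intuition about what the `memory_key` is doing in the snippet above: a buffer memory keeps a running list of messages and hands them back to the prompt under that key. This is a simplified pure-Python sketch of that behavior, not the real LangChain class:

```python
# Simplified stand-in for a conversation buffer memory
# (illustration only, not the actual LangChain implementation).
class BufferMemory:
    def __init__(self, memory_key="chat_history"):
        self.memory_key = memory_key
        self.messages = []  # grows with every saved exchange

    def save_context(self, user_input, ai_output):
        # Append one human/AI turn to the buffer.
        self.messages.append(("human", user_input))
        self.messages.append(("ai", ai_output))

    def load_memory_variables(self):
        # The agent's prompt reads the history under memory_key.
        return {self.memory_key: list(self.messages)}


memory = BufferMemory(memory_key="chat_history")
memory.save_context("my name is bobyang", "Hello Bob!")
history = memory.load_memory_variables()["chat_history"]
```

If the prompt never asks for `chat_history` (or asks under a different key), the buffer fills up but the model never sees it, which matches the "Missing some input keys" symptom in the linked issue.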
Another similar issue is How to add conversational memory to pandas toolkit agent?. The user in this discussion was able to add conversational memory to a pandas toolkit agent and shared their solution on StackOverflow.
If these solutions don't help, could you please provide more details about the problem? Specifically, what is the expected behavior and what is the actual behavior? Any error messages would also be helpful.
Regarding your feedback about the LangChain documentation, I understand your frustration. As a maintainer, I'm always looking for ways to improve the documentation and make it more user-friendly. If you have any specific suggestions or areas you think could be improved, please let me know. Your feedback is valuable in helping us improve the LangChain framework.
I hope this helps! If you have any more questions or need further clarification, feel free to ask.
-Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
You need to pass in the `MessagesPlaceholder` for the memory variable:
```python
prompt = conversational_agent.agent.create_prompt(
    system_message=sys_prompt,
    extra_prompt_messages=[MessagesPlaceholder("history")],
    tools=self.get_tools(),
)
```
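For intuition, a messages placeholder just marks the spot in the prompt where the stored history gets spliced in at call time. This is a rough pure-Python analogy with hypothetical names, not the LangChain API:

```python
# Hypothetical stand-in: a Placeholder reserves a slot in the prompt
# template that is filled with the bound message list at call time.
class Placeholder:
    def __init__(self, variable_name):
        self.variable_name = variable_name


def build_prompt(template, variables):
    # Expand each Placeholder into the message list bound to its name;
    # pass every other template entry through unchanged.
    out = []
    for part in template:
        if isinstance(part, Placeholder):
            out.extend(variables[part.variable_name])
        else:
            out.append(part)
    return out


template = [
    "system: you are helpful",
    Placeholder("history"),
    "human: whats my name?",
]
prompt = build_prompt(
    template,
    {"history": ["human: my name is bobyang", "ai: Hello Bob!"]},
)
```

The key point is that the placeholder's `variable_name` must match the memory's `memory_key`, otherwise the history is never injected into the prompt.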
Thank you bro @keenborder786, but it doesn't work for me. I also tried the https://python.langchain.com/docs/modules/agents/how_to/add_memory_openai_functions demo and it doesn't work either. Here is my code:
```python
class SimpleChat:
    def __init__(self) -> None:
        self.llm = ChatOpenAI(
            temperature=0,
            model="gpt-3.5-turbo-0613",
            openai_api_key="sk-",
            openai_api_base="https://",
        )
        self.agent = self.get_agent()

    def get_tools(self):
        search = SerpAPIWrapper(serpapi_api_key="")
        llm_math_chain = LLMMathChain.from_llm(llm=self.llm, verbose=True)
        return [
            Tool(
                name="Search",
                func=search.run,
                description="useful for when you need to answer questions about current events. You should ask targeted questions",
            ),
            Tool(
                name="Calculator",
                func=llm_math_chain.run,
                description="useful for when you need to answer questions about math",
            ),
        ]

    def get_agent(self):
        agent_kwargs = {
            "extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")],
        }
        memory = ConversationBufferMemory(memory_key="memory", return_messages=True)
        agent = initialize_agent(
            self.get_tools(),
            self.llm,
            agent=AgentType.OPENAI_FUNCTIONS,
            verbose=True,
            agent_kwargs=agent_kwargs,
            memory=memory,
        )
        return agent

    def chat_with_bot(self, input_message):
        print("agent", self.agent)
        response = self.agent.run(input_message)
        return response
```
【user】: hi
【ai】: Hello! How can I assist you today?
【user】: my name is bobyang
【ai】: Hello Bob! How can I assist you today?
【user】: whats my name?
【ai】: I'm sorry, but I don't have access to personal information.
I don't know why.
I fixed it, and it was my fault... I shouldn't have created the memory inside a function, because that way the memory would be recreated and cleared every time. It's better to keep the memory as an attribute on the class instance. I'm so stupid.
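For anyone hitting the same thing, the difference is where the memory object lives. A minimal pure-Python sketch (no LangChain, hypothetical class names) of function-scoped versus instance-scoped state:

```python
class BrokenChat:
    """Creates the memory inside the method, so every call starts fresh."""

    def ask(self, message):
        memory = []  # re-created on every call: prior history is lost
        memory.append(message)
        return len(memory)


class FixedChat:
    """Keeps the memory on the instance, so it survives across calls."""

    def __init__(self):
        self.memory = []  # created once, shared by every call

    def ask(self, message):
        self.memory.append(message)
        return len(self.memory)


broken = BrokenChat()
fixed = FixedChat()
for msg in ["hi", "my name is bobyang", "whats my name?"]:
    broken_turns = broken.ask(msg)  # always sees exactly 1 message
    fixed_turns = fixed.ask(msg)    # sees the full accumulated history
```

The same logic applies to a `ConversationBufferMemory`: if it is rebuilt on every chat turn, the agent answers "whats my name?" with no history to draw on.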
Issue you'd like to raise.
This is my code. Why is the memory not working? Can anyone help me? Thanks!
Suggestion:
the LangChain docs are really bad!