langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
94.6k stars 15.31k forks

Why is the memory not working? Can anyone help me? #14799

Closed tomiezhang closed 10 months ago

tomiezhang commented 10 months ago

Issue you'd like to raise.

```python
from langchain.agents import AgentType, Tool, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory


class SimpleChat:
    def __init__(self) -> None:
        self.llm = ChatOpenAI(
            temperature=0,
            model="gpt-4-0613",
            openai_api_key="sk-",
            openai_api_base="https://---",
        )
        self.memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

    def get_tools(self):
        return [
            Tool(
                name="Search",
                func=Tooluse().get_google().run,
                description="useful for when you want to search for something on the internet",
            )
        ]

    def get_agent(self):
        conversational_agent = initialize_agent(
            agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
            tools=self.get_tools(),
            llm=self.llm,
            verbose=False,
            memory=self.memory,
        )
        sys_prompt = """You are a chatbot for a Serverless company AntStack and strictly answer the question based on the context below, and if the question can't be answered based on the context, say \"I'm sorry I cannot answer the question, contact connect@antstack.com\""""
        prompt = conversational_agent.agent.create_prompt(
            system_message=sys_prompt,
            tools=self.get_tools(),
        )
        conversational_agent.agent.llm_chain.prompt = prompt
        return conversational_agent

    def chat_with_bot(self, input_message):
        agent = self.get_agent()
        print("agent", agent)
        response = agent.run(input_message)
        return response
```

This is my code. Why doesn't the memory work? Can anyone help me? Thanks!

Suggestion:

The LangChain documentation is really bad!

dosubot[bot] commented 10 months ago

🤖

Hello @tomiezhang 👋,

Nice to meet you! I'm Dosu, a friendly bot here to assist you with your programming woes, answer your queries, and guide you to become a contributor to this repository. While we wait for a human maintainer, feel free to ask me any questions. I'm available round the clock to make your experience smoother! 😄

Based on the code you've shared, it seems like you're trying to use the ConversationBufferMemory for storing the conversation history. However, it's not clear from your message what exactly is not working with the memory. Are you getting any error messages? Or is the memory not storing the conversation history as expected?

From the context provided, there are a few similar issues that have been solved in the LangChain repository. One of them is Agent does not recognize chat history (Missing some input keys: {'chat_history'}). The user in this issue was able to resolve it by using the ConversationalChatAgent.create_prompt method instead of the ZeroShotAgent.create_prompt method. Here is the code snippet they used:

```python
from langchain.agents import AgentType, ConversationalChatAgent, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

# `tools`, `prefix`, and `format_instructions` come from that user's own setup.
prompt = ConversationalChatAgent.create_prompt(
    tools,
    system_message=prefix + format_instructions
)

llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo")

memory = ConversationBufferMemory(memory_key='chat_history')

agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent_type=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    prompt=prompt,
    memory=memory,
    handle_parsing_errors=True,
    max_iterations=5
)
```

Another similar issue is How to add conversational memory to pandas toolkit agent?. The user in this discussion was able to add conversational memory to a pandas toolkit agent and shared their solution on StackOverflow.

If these solutions don't help, could you please provide more details about the problem? Specifically, what is the expected behavior and what is the actual behavior? Any error messages would also be helpful.

Regarding your feedback about the LangChain documentation, I understand your frustration. As a maintainer, I'm always looking for ways to improve the documentation and make it more user-friendly. If you have any specific suggestions or areas you think could be improved, please let me know. Your feedback is valuable in helping us improve the LangChain framework.

I hope this helps! If you have any more questions or need further clarification, feel free to ask.

-Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

keenborder786 commented 10 months ago

You need to pass a `MessagesPlaceholder` for the memory variable:

```python
prompt = conversational_agent.agent.create_prompt(
    system_message=sys_prompt,
    extra_prompt_messages=[MessagesPlaceholder("history")],
    tools=self.get_tools(),
)
```

tomiezhang commented 10 months ago

Thank you bro @keenborder786, but it doesn't work for me. I also tried the https://python.langchain.com/docs/modules/agents/how_to/add_memory_openai_functions demo, and that doesn't work either. Here is my code:

```python
from langchain.agents import AgentType, Tool, initialize_agent
from langchain.chains import LLMMathChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder
from langchain.utilities import SerpAPIWrapper


class SimpleChat:
    def __init__(self) -> None:
        self.llm = ChatOpenAI(
            temperature=0,
            model="gpt-3.5-turbo-0613",
            openai_api_key="sk-",
            openai_api_base="https://",
        )
        self.agent = self.get_agent()

    def get_tools(self):
        search = SerpAPIWrapper(serpapi_api_key="")
        llm_math_chain = LLMMathChain.from_llm(llm=self.llm, verbose=True)
        return [
            Tool(
                name="Search",
                func=search.run,
                description="useful for when you need to answer questions about current events. You should ask targeted questions",
            ),
            Tool(
                name="Calculator",
                func=llm_math_chain.run,
                description="useful for when you need to answer questions about math",
            )
        ]

    def get_agent(self):
        agent_kwargs = {
            "extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")],
        }
        memory = ConversationBufferMemory(memory_key="memory", return_messages=True)
        agent = initialize_agent(
            self.get_tools(),
            self.llm,
            agent=AgentType.OPENAI_FUNCTIONS,
            verbose=True,
            agent_kwargs=agent_kwargs,
            memory=memory,
        )
        return agent

    def chat_with_bot(self, input_message):
        print("agent", self.agent)
        response = self.agent.run(input_message)
        return response
```

And when I use the chat_with_bot function, here is the AI response:

ใ€userใ€‘: hi
ใ€aiใ€‘: Hello! How can I assist you today?
ใ€userใ€‘: my name is bobyang
ใ€aiใ€‘: Hello Bob! How can I assist you today?
ใ€userใ€‘: whats my name?
ใ€aiใ€‘: I'm sorry, but I don't have access to personal information.

I don't know why.

tomiezhang commented 10 months ago

I fixed it, it was my fault... I shouldn't have created the memory inside a function, because that way the memory is recreated (and therefore cleared) on every call. It's better to keep the memory as an instance attribute on the class. I'm so stupid.
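
For anyone landing here later, the failure mode can be sketched without LangChain at all. In this minimal plain-Python illustration (the class names are hypothetical, chosen only for this example), `BrokenChat` rebuilds its memory on every call, so history never survives a turn, while `FixedChat` creates the memory once in `__init__` and the history accumulates:

```python
class BrokenChat:
    """Rebuilds the memory inside the method, like the original get_agent()."""

    def chat(self, message):
        memory = []            # fresh list every call -- previous turns are gone
        memory.append(message)
        return len(memory)     # always 1, no matter how many turns happened


class FixedChat:
    """Creates the memory once per object, so every turn sees the history."""

    def __init__(self):
        self.memory = []       # one list for the object's lifetime

    def chat(self, message):
        self.memory.append(message)
        return len(self.memory)  # grows with each turn


broken = BrokenChat()
broken.chat("hi")
print(broken.chat("my name is bobyang"))  # 1 -- the first turn was forgotten

fixed = FixedChat()
fixed.chat("hi")
print(fixed.chat("my name is bobyang"))   # 2 -- both turns are remembered
```

The same principle applies to the LangChain code in this thread: once `ConversationBufferMemory(...)` is constructed in `__init__` (or otherwise kept alive across calls) instead of inside `get_agent`, the `chat_history`/`memory` variable persists between turns.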