langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
92.33k stars 14.76k forks

Chatbot memory integration #3977

Closed zigax1 closed 11 months ago

zigax1 commented 1 year ago

Hello everyone!

I am using LangChain and want to implement chatbot memory. I am following the docs exactly, but the bot doesn't remember anything I tell it.

Code snippet:

from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationalRetrievalChain

llm = ChatOpenAI(model_name='gpt-3.5-turbo', temperature=0.3, openai_api_key=OPENAI_API_KEY)

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
rqa = ConversationalRetrievalChain.from_llm(llm, docsearch.as_retriever(), memory=memory)

def retrieve_answer(query, chat_history):  # chat_history is currently unused
    memory.chat_memory.add_user_message(query)
    res = rqa({"question": query})
    retrieval_result = res["answer"]

    # Fall back to the bare LLM when retrieval finds nothing useful
    if "The given context does not provide" in retrieval_result:
        base_result = llm.generate([query])
        return base_result.generations[0][0].text
    else:
        return retrieval_result

messages = []

print("Welcome to the chatbot. Enter 'quit' to exit the program.")
while True:
    user_message = input("You: ")
    if user_message.lower() == "quit":
        break
    answer = retrieve_answer(user_message, messages)
    print("Assistant:", answer)

    messages.append((user_message, answer))

Whole python script is located here: https://github.com/zigax1/chat-with-pdf.git

Does anyone have any idea what I am doing wrong?

Thanks to everyone for the help.
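One likely culprit: the snippet both passes `memory` to the chain and calls `memory.chat_memory.add_user_message(query)` by hand, so each user turn may be recorded twice (a `ConversationalRetrievalChain` constructed with `memory=` saves the question and answer itself after each call). The effect can be sketched without LangChain; `Memory` and `Chain` below are hypothetical stand-ins, not LangChain classes:

```python
class Memory:
    """Minimal stand-in for a conversation buffer: stores (role, text) pairs."""
    def __init__(self):
        self.messages = []

    def add_user_message(self, text):
        self.messages.append(("human", text))

    def add_ai_message(self, text):
        self.messages.append(("ai", text))


class Chain:
    """Stand-in for a chain that, given memory=..., records each turn itself."""
    def __init__(self, memory):
        self.memory = memory

    def __call__(self, inputs):
        question = inputs["question"]
        answer = f"echo: {question}"
        # The chain saves both sides of the turn on every call.
        self.memory.add_user_message(question)
        self.memory.add_ai_message(answer)
        return {"answer": answer}


memory = Memory()
chain = Chain(memory)

# Mimic the snippet: the caller ALSO records the user message by hand...
memory.add_user_message("hello")
chain({"question": "hello"})

# ...so the human turn is stored twice, corrupting the history the chain
# feeds back to the model on the next call.
print(memory.messages)
# → [('human', 'hello'), ('human', 'hello'), ('ai', 'echo: hello')]
```

If this is the cause, dropping the manual `add_user_message` call and letting the chain manage the buffer should leave exactly one copy of each turn.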


rchatham commented 1 year ago

Also seeing the same. ConversationBufferMemory is not giving the bot access to previous messages.

bconrad98 commented 1 year ago

Yes, I think memory is broken specifically for the ChatOpenAI model. The OpenAI LLM seems to work fine.
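A plausible explanation for a ChatOpenAI/OpenAI difference is prompt shape: completion-style LLMs take one flat string, while chat models take a list of role-tagged messages, so buffered history has to be handed over as messages (`return_messages=True`) rather than flattened into a string. The distinction can be sketched with stand-in functions (not LangChain APIs):

```python
def completion_llm(prompt: str) -> str:
    # Completion-style models receive one flat string; any history must
    # already be formatted into it.
    assert isinstance(prompt, str)
    return "ok"

def chat_llm(messages: list) -> str:
    # Chat models receive structured (role, content) pairs; history is
    # passed through as individual messages.
    assert all(isinstance(m, tuple) and len(m) == 2 for m in messages)
    return "ok"

history = [("human", "hi"), ("ai", "hello")]

# A buffer configured like return_messages=False would flatten the history:
flat = "\n".join(f"{role}: {text}" for role, text in history)
completion_llm(flat + "\nhuman: how are you?")

# Configured like return_messages=True, it hands the messages through as-is:
chat_llm(history + [("human", "how are you?")])
```

If the memory emits the wrong shape for the model in use, the model effectively sees no usable history, which would match the symptom described here.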

shrshk commented 1 year ago

I think I have the same issue, and I'm using 'from langchain import OpenAI'.

dosubot[bot] commented 1 year ago

Hi, @zigax1! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you were trying to implement chatbot memory using LangChain, but it doesn't seem to be working as expected. Other users have also reported the same issue, specifically with the ChatOpenAI model. One user mentioned using 'from langchain import OpenAI' and experiencing the same problem.

If this issue is still relevant to the latest version of the LangChain repository, please let the LangChain team know by commenting on this issue. Otherwise, feel free to close the issue yourself. If there is no response within 7 days, the issue will be automatically closed.

Thank you for your understanding and contribution to the LangChain community!