mortium91 / langchain-assistant

Interact with LLMs (GPT-3, GPT-3.5, GPT-4) via messenger apps like Telegram, WhatsApp, and Facebook Messenger
https://abracadata.io
MIT License
155 stars 33 forks

How can we maintain memory or state of chatbot to remember context? #16

Open SaadAhmed96 opened 1 year ago

SaadAhmed96 commented 1 year ago

I have made a simple chatbot connected to WhatsApp using Twilio that uses langchain to respond to queries.

The problem is that I am unable to maintain state. I looked at the source code of this repo; here, the memory is stored on disk and then loaded as context.
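For reference, a minimal runnable sketch of that disk-based approach: dump the chat history to a JSON file after each turn and reload it on the next request. The file name and record format here are illustrative assumptions, not the repo's actual code.

```python
# Sketch of persisting chat history to disk and reloading it as context.
# HISTORY_FILE and the record format are illustrative, not the repo's code.
import json
import os

HISTORY_FILE = "chat_history.json"

def load_history():
    # Reload prior turns if the file exists; start fresh otherwise.
    if os.path.exists(HISTORY_FILE):
        with open(HISTORY_FILE) as f:
            return json.load(f)
    return []

def save_history(history):
    with open(HISTORY_FILE, "w") as f:
        json.dump(history, f)

# On each request: load, append the new turn, save.
history = load_history()
history.append({"role": "human", "text": "Hi, my name is Saad."})
save_history(history)
```

This survives process restarts, which an in-memory buffer does not, at the cost of a read/write per request.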

Is there any other way of doing it?

Thanks.

scientist1642 commented 1 year ago

Hi, what do you mean by unable to maintain the state? Can you describe a scenario?

SaadAhmed96 commented 1 year ago

Yeah, so for example I asked my bot the following:

Me: How are you my name is Saad.
Bot: I am good. thanks for asking Saad.

Me: Do you remember my name?
Bot: No I don't remember your name.

I haven't actually deployed your code for my bot; I was only looking at it for inspiration, to see how I can maintain the conversation history so that the bot is more context-aware.

This is my code

# Third-party imports
from fastapi import FastAPI, Form, Request
import os
from utils import context_template

# LangChain imports
from langchain.memory import ConversationBufferMemory
from langchain import OpenAI, LLMChain, PromptTemplate

from twilio.rest import Client

from dotenv import load_dotenv

load_dotenv()

app = FastAPI()

@app.post("/message")
async def reply(request: Request):
    form_data = await request.form()
    print(form_data)

    from_number = form_data['From']
    print(from_number)

    Body = form_data["Body"]

    # Get the OpenAI API key, Twilio account SID, auth token, and WhatsApp number from environment variables
    openai_api_secret_key = os.getenv("OPENAI_API_KEY")
    account_sid = os.getenv('TWILIO_ACCOUNT_SID')
    auth_token = os.getenv('TWILIO_AUTH_TOKEN')
    whatsapp_number = os.getenv('TWILIO_NUMBER')

    client = Client(account_sid, auth_token)

    # Use a LangChain LLMChain instead of calling OpenAI directly
    prompt = PromptTemplate(input_variables=["chat_history", "human_input"], template=context_template())
    memory = ConversationBufferMemory(memory_key="chat_history")

    llm_chain = LLMChain(llm=OpenAI(openai_api_key=openai_api_secret_key), prompt=prompt, verbose=True, memory=memory)

    # The generated text
    chat_response = llm_chain.predict(human_input=Body)
    print(chat_response)

    message = client.messages.create(
            from_=f"whatsapp:{whatsapp_number}",
            body=chat_response,
            to=from_number
    )

    return ""

Sorry for the long code snippet, but if you have any ideas please let me know.

scientist1642 commented 1 year ago

That's because in your snippet memory and llm_chain are created anew on every request, so the history is wiped each time. Try moving the definitions of prompt, memory, and llm_chain to module level, right after app = FastAPI().
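A toy, runnable sketch of that fix, using stand-in classes instead of langchain so it runs without an API key. The point is where the objects are created: once at module level, not inside the request handler.

```python
# Stand-ins for the langchain objects, so the sketch runs self-contained.

class FakeMemory:
    """Stand-in for ConversationBufferMemory: just accumulates turns."""
    def __init__(self):
        self.history = []

class FakeChain:
    """Stand-in for LLMChain: replies with how many turns it remembers."""
    def __init__(self, memory):
        self.memory = memory

    def predict(self, human_input):
        self.memory.history.append(human_input)
        return f"I remember {len(self.memory.history)} message(s)."

# Module level -- built ONCE, shared across all requests, so context survives.
llm_chain = FakeChain(FakeMemory())

def reply(body):
    # Creating FakeChain(FakeMemory()) *here* instead would reset the
    # history on every message -- that is the original bug.
    return llm_chain.predict(body)

print(reply("Hi, my name is Saad."))      # I remember 1 message(s).
print(reply("Do you remember my name?"))  # I remember 2 message(s).
```

The same restructuring applies verbatim to the real ConversationBufferMemory and LLMChain: hoist their construction above the @app.post("/message") handler.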

SaadAhmed96 commented 1 year ago

Ahoy! Thanks man, it worked like a charm.

SaadAhmed96 commented 1 year ago

Another question:

If I message from another number, it still uses the context from the previous number I was messaging from. Any suggestions on how to resolve it?

scientist1642 commented 1 year ago

Yes, I was just typing that. A single shared memory is only OK if you are using it for one user; otherwise you need to store memories separately, in a dict, a database, or something similar, keyed by from_number.

SaadAhmed96 commented 1 year ago

Yeah, I am thinking about using Twilio's Conversations API or writing my own custom logic to store and retrieve them.