aws-samples / generative-ai-amazon-bedrock-langchain-agent-example

MIT No Attribution

Remembering the chat history #17

Closed alexsambacanada closed 3 months ago

alexsambacanada commented 4 months ago

Hey Kyle!

Loving this project! Quick question for you:

In chat.py we have this:

def __init__(self, event):
    print("Initializing Chat with FSI Agent")
    self.set_user_id(event)
    self.set_chat_index()
    self.set_memory(event)
    self.create_new_chat()

If I'm understanding this correctly:

  1. We get the user ID.
  2. We get the index number from the index DynamoDB table (e.g. '0').
  3. We use this index number and the user ID to create a session ID, which we then use within LangChain (I sketch this below).
  4. We finally update the index number in the index table.
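If it helps, here's roughly how I picture that flow mapping onto LangChain memory. This is only a sketch on my end, assuming the memory is backed by DynamoDBChatMessageHistory; the table name ("SessionTable"), the helper name, and the session ID format are my own placeholders, not necessarily what chat.py actually does:

from langchain.memory import ConversationBufferMemory
from langchain.memory.chat_message_histories import DynamoDBChatMessageHistory

def build_memory(user_id: str, chat_index: int) -> ConversationBufferMemory:
    # Derive the session ID from the user ID plus the per-user chat index,
    # so each index value corresponds to one independent conversation.
    session_id = f"{user_id}-{chat_index}"

    # LangChain persists each turn of the conversation to DynamoDB under this session ID.
    message_history = DynamoDBChatMessageHistory(
        table_name="SessionTable",  # placeholder table name
        session_id=session_id,
    )

    # The buffer memory replays the stored messages into the prompt on the next call.
    return ConversationBufferMemory(
        memory_key="chat_history",
        chat_memory=message_history,
        return_messages=True,
    )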

I'd like subsequent invocations of the Lambda to recall the existing history (i.e. stay on the same index), so that if I ask a follow-up question the LLM remembers the conversation.
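
The change I have in mind is to read back the user's latest index from the index table and reuse it, instead of always moving on to a fresh one, so a new Lambda invocation ends up with the same session ID and therefore the same stored history. Something like this sketch, where the table name ("ChatIndexTable") and attribute names are placeholders I made up:

import boto3

dynamodb = boto3.resource("dynamodb")

def get_existing_chat_index(user_id: str) -> int:
    # Look up the user's current chat index instead of bumping it, so that a
    # new Lambda invocation reuses the same session ID and, with it, the same
    # conversation history stored in DynamoDB.
    table = dynamodb.Table("ChatIndexTable")  # placeholder table name
    response = table.get_item(Key={"user_id": user_id})
    item = response.get("Item")
    # DynamoDB returns numbers as Decimal, so convert; start at 0 for new users.
    return int(item["chat_index"]) if item else 0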

Before I start making changes to the file, I wanted to confirm that my understanding is correct, or find out whether I'm going about this the wrong way. I'm pretty new to LLMs and LangChain, but persisting history across invocations seems to me to be the entire point of having "memory", which makes me feel like I'm missing something.

Thanks!

kyleblocksom commented 3 months ago

See commit.