Closed portkeys closed 1 year ago
Same question. Thanks!
Same.
@portkeys seems doable using llama_index plus tooling from langchain:

```python
# Import paths follow the linked notebook (older llama_index / langchain
# releases); adjust them for your installed versions.
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from llama_index.langchain_helpers.agents import create_llama_chat_agent

memory = ConversationBufferMemory(memory_key="chat_history")
llm = OpenAI(temperature=0)
agent_chain = create_llama_chat_agent(
    toolkit,  # a LlamaToolkit wrapping your index, built beforehand
    llm,
    memory=memory,
    verbose=True,
)
```

source: https://github.com/jerryjliu/llama_index/blob/main/examples/chatbot/Chatbot_SEC.ipynb
Thanks for sharing this solution! @phiweger
Hi, @portkeys! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you were asking if it is possible to add a `memory` argument to the `ChatVectorDBChain` in the LangChain library. You also provided some code and asked if it is the correct way to use it. There have been a few comments from other users discussing related issues and providing a potential solution using `llama_index` and `langchain`. You thanked one user for sharing this solution.
Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to the LangChain project! Let us know if you have any further questions or concerns.
Love the LangChain library, so obsessed with it lately!

I've been using `ChatVectorDBChain`, which retrieves answers from a Pinecone vectorstore, and it's been working very well.

But one thing I noticed is that for a normal `ConversationChain`, you can add a `memory` argument, which provides a nice user experience because it remembers the discussed entities.

Question: can we add a `memory` argument to `ChatVectorDBChain`? If it already exists, could you point me to whether the code below is the right way to use it? Thanks again so much!! 😊
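For context on what the `memory` argument buys you: a conversation-buffer memory is essentially a store of prior turns that gets rendered back into each new prompt, so the model can resolve references like "it". Below is a minimal plain-Python sketch of that pattern; the class and function names here are hypothetical illustrations, not LangChain's actual API.

```python
class BufferMemory:
    """Toy stand-in for a conversation buffer: keeps prior turns verbatim."""

    def __init__(self, memory_key="chat_history"):
        self.memory_key = memory_key
        self.turns = []  # list of (human, ai) pairs

    def save_context(self, human, ai):
        # Record one completed exchange.
        self.turns.append((human, ai))

    def load(self):
        # Render the history as one string, the way a buffer memory
        # would inject it into a prompt template.
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)


def build_prompt(memory, question):
    """Prepend accumulated history so the model sees earlier entities."""
    history = memory.load()
    if history:
        return f"{history}\nHuman: {question}\nAI:"
    return f"Human: {question}\nAI:"


memory = BufferMemory()
memory.save_context("Who founded Pinecone?", "Edo Liberty.")
prompt = build_prompt(memory, "When was it founded?")
# The follow-up prompt now carries the earlier turn, so "it" is resolvable.
```

A chain with `memory` wired in does this bookkeeping for you on every call; without it, each query to the vectorstore-backed chain is answered in isolation.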