AhmedAffan786 closed this issue 1 year ago
We now have data agents and chat engines with basic window buffer memory support. More memory modules are planned for the future (and would love a contribution to them!!)
https://gpt-index.readthedocs.io/en/latest/core_modules/agent_modules/agents/modules.html
https://gpt-index.readthedocs.io/en/latest/core_modules/query_modules/chat_engines/root.html
Current (default) memory buffer: https://github.com/jerryjliu/llama_index/blob/main/llama_index/memory/chat_memory_buffer.py
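For illustration, the idea behind a window buffer memory can be sketched in plain Python. This is a hypothetical `WindowBufferMemory`, not the library's actual `ChatMemoryBuffer` (which trims by token count rather than message count); it just shows the core behavior of keeping only the most recent turns:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class ChatMessage:
    role: str      # e.g. "user" or "assistant"
    content: str


class WindowBufferMemory:
    """Toy sketch: keep only the N most recent chat messages."""

    def __init__(self, max_messages: int = 6):
        # deque with maxlen evicts the oldest entry automatically
        self._buffer: deque = deque(maxlen=max_messages)

    def put(self, message: ChatMessage) -> None:
        self._buffer.append(message)

    def get(self) -> list:
        # messages to prepend to the next LLM prompt as context
        return list(self._buffer)

    def reset(self) -> None:
        self._buffer.clear()


memory = WindowBufferMemory(max_messages=4)
for i in range(6):
    memory.put(ChatMessage(role="user", content=f"turn {i}"))

# only the last 4 turns remain in the window
print([m.content for m in memory.get()])
```

A chat engine would call `get()` before each LLM request and `put()` after each user/assistant turn, so older turns naturally fall out of the context window.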
Feature Description
In-house chat memory that remembers the context of previous chats/conversations. LangChain already provides this, so why doesn't LlamaIndex support chat memory as well?
Reason
Yes, we can integrate with LangChain agents for this, but they tend to be too generative, because they are chat agents rather than Q&A modules. I hope you understand and can provide a solution.
Value of Feature
No response