run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

When can we expect in-house memory support in LlamaIndex Q&A? #6537

Closed AhmedAffan786 closed 1 year ago

AhmedAffan786 commented 1 year ago

Feature Description

In-house chat memory, i.e. the ability to remember the context of previous chats/conversations. LangChain already provides this; why doesn't LlamaIndex support chat memory as well?

Reason

Yes, we can integrate with LangChain agents for this, but they tend to get too generative, because they are chat agents rather than Q&A modules. Hope you understand and can provide a solution.

Value of Feature

No response

logan-markewich commented 1 year ago

We now have data agents and chat engines with basic window buffer memory support. More memory modules are planned for the future (and we'd love contributions to them!!)
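To illustrate the idea behind a chat engine with memory, here is a minimal sketch of how a conversation loop can thread remembered turns into each prompt. The names here (`SimpleChatEngine`, `fake_llm`) are hypothetical stand-ins for illustration only, not the actual llama_index API:

```python
from typing import List, Tuple


def fake_llm(prompt: str) -> str:
    # Stand-in for a real LLM call: just echo the last user line.
    last_user = prompt.rsplit("user: ", 1)[-1]
    return f"You asked about: {last_user}"


class SimpleChatEngine:
    """Hypothetical sketch: a chat loop that keeps its own history."""

    def __init__(self) -> None:
        self.history: List[Tuple[str, str]] = []  # (role, text) pairs

    def chat(self, message: str) -> str:
        # Build the prompt from all remembered turns plus the new message,
        # so the model sees previous context on every call.
        self.history.append(("user", message))
        prompt = "\n".join(f"{role}: {text}" for role, text in self.history)
        reply = fake_llm(prompt)
        self.history.append(("assistant", reply))
        return reply


engine = SimpleChatEngine()
engine.chat("What is LlamaIndex?")
reply = engine.chat("Does it have memory?")
```

The key point is that each call to `chat()` rebuilds the prompt from the stored history, which is what distinguishes a chat engine from a stateless Q&A query.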

https://gpt-index.readthedocs.io/en/latest/core_modules/agent_modules/agents/modules.html

https://gpt-index.readthedocs.io/en/latest/core_modules/query_modules/chat_engines/root.html

Current (default) memory buffer: https://github.com/jerryjliu/llama_index/blob/main/llama_index/memory/chat_memory_buffer.py
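As a rough sketch of what a token-limited memory buffer like the one linked above does conceptually, the class below stores every message but only returns the most recent ones that fit under a token budget. The class name, the word-count "tokenizer", and the limits are illustrative assumptions, not the real `ChatMemoryBuffer` implementation:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ChatMessage:
    role: str
    content: str


@dataclass
class WindowMemoryBuffer:
    """Hypothetical sketch of a token-limited chat memory window."""

    token_limit: int = 50
    messages: List[ChatMessage] = field(default_factory=list)

    def _tokens(self, msg: ChatMessage) -> int:
        # Crude stand-in for a real tokenizer: count whitespace-split words.
        return len(msg.content.split())

    def put(self, msg: ChatMessage) -> None:
        self.messages.append(msg)

    def get(self) -> List[ChatMessage]:
        # Walk backwards from the newest message, keeping messages until
        # the token budget is exceeded, then return them in original order.
        total = 0
        kept: List[ChatMessage] = []
        for msg in reversed(self.messages):
            total += self._tokens(msg)
            if total > self.token_limit:
                break
            kept.append(msg)
        return list(reversed(kept))


memory = WindowMemoryBuffer(token_limit=8)
memory.put(ChatMessage("user", "hello there my friend"))
memory.put(ChatMessage("assistant", "hi how can I help"))
memory.put(ChatMessage("user", "what is llama index"))
window = memory.get()  # only the turns that fit under the limit
```

The full history is retained in `messages`, while `get()` yields just the sliding window passed to the model, which is the basic trade-off behind window buffer memory.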