Currently, only the latest chat message (the user's question) is fed to the chat model, even though the complete chat history is already sent to the server. We should use LangChain's memory component to include the previous messages in the model invocation, with a limit on how many are kept.
Tasks
[ ] Add a BufferWindowMemory keeping the last 5 messages to the existing chain in the /chat endpoint
[ ] Extract the 5-message window size into constants.ts
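As a rough sketch of the intended behavior: in the /chat chain, the memory would be constructed with something like `new BufferWindowMemory({ k: CHAT_MEMORY_WINDOW_SIZE })` (LangChain's `k` option controls the window). The snippet below only illustrates the windowing itself in plain TypeScript; `CHAT_MEMORY_WINDOW_SIZE`, `ChatMessage`, and `windowedHistory` are hypothetical names, not part of the current codebase.

```typescript
// Hypothetical constant that would be extracted into constants.ts.
export const CHAT_MEMORY_WINDOW_SIZE = 5;

// Minimal message shape for illustration only.
type ChatMessage = { role: "user" | "assistant"; content: string };

// Keep only the most recent `windowSize` messages, mirroring what
// LangChain's BufferWindowMemory does with its `k` option.
export function windowedHistory(
  history: ChatMessage[],
  windowSize: number,
): ChatMessage[] {
  return history.slice(-windowSize);
}
```

With a window of 5, a history of 8 messages would be trimmed to the last 5 before being passed to the model invocation.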