Open Vansick1 opened 10 months ago
This is something that could be added. Let me think about how best to do this.
One thing I'm noticing is that the memory buffer eventually overflows and the program exits. I'm experimenting with saving the chat history to a text file and re-ingesting it after each answer is displayed, but I don't know how to make the model gradually forget old tokens to avoid the aforementioned buffer crash.
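A minimal sketch of the "gradually forget" idea: keep the history under a token budget by dropping the oldest turns first, so the buffer can't grow without bound. The budget value and the whitespace-based token count are placeholders I made up for illustration; a real tokenizer would be more accurate.

```python
def trim_history(history, max_tokens=2048):
    """Drop the oldest (role, text) turns until a rough token count fits.

    Token counting is approximated with whitespace splitting; swap in a
    real tokenizer for accuracy.
    """
    def count(turn):
        return len(turn[1].split())

    total = sum(count(t) for t in history)
    trimmed = list(history)
    while trimmed and total > max_tokens:
        total -= count(trimmed.pop(0))  # forget the oldest turn first
    return trimmed

history = [
    ("user", "hello " * 100),       # ~100 tokens
    ("assistant", "hi " * 100),     # ~100 tokens
    ("user", "latest question"),    # ~2 tokens
]
print(trim_history(history, max_tokens=150))
```

This keeps the most recent turns intact, which is usually what matters for answering the next question.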
Perhaps https://github.com/gmickel/memorybot could serve as a base? It's in TypeScript, but I'm fairly certain LangChain's Python version offers the same functionality.
It would be useful if the chat history could be passed in with the query. That way it would be possible to implement a stateless web API, with the entire chat history passed back and forth between client and server.
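Something like this sketch of a stateless handler: the client sends the full history alongside each query and receives the updated history back, so the server holds no per-session state. `generate_answer` here is just a stand-in for the actual model call, and the JSON shape is my own assumption, not anything the project defines.

```python
import json

def generate_answer(history, query):
    # Placeholder for the real LLM call; echoes the query back.
    return f"echo: {query}"

def handle_request(payload_json):
    """Stateless round trip: history in, updated history out."""
    payload = json.loads(payload_json)
    history = payload.get("history", [])
    query = payload["query"]
    answer = generate_answer(history, query)
    # Append the new exchange and return everything to the client.
    history = history + [
        {"role": "user", "content": query},
        {"role": "assistant", "content": answer},
    ]
    return json.dumps({"answer": answer, "history": history})

resp = handle_request(json.dumps({"history": [], "query": "hi"}))
print(json.loads(resp)["history"])
```

The client would simply store the returned `history` and include it in its next request.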
I'm noticing that chat history only persists during runtime; it is lost when the program closes. Is there a way to add persistent storage for it?
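A simple way to get persistence would be writing the history to a JSON file on exit and reading it back on startup, along these lines (the file path and message format here are just illustrative assumptions):

```python
import json
import os
import tempfile

def save_history(history, path):
    """Write the chat history to a JSON file."""
    with open(path, "w") as f:
        json.dump(history, f)

def load_history(path):
    """Read the chat history back, or start fresh if no file exists yet."""
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.gettempdir(), "chat_history.json")
save_history([{"role": "user", "content": "hello"}], path)
print(load_history(path))
```

For anything beyond a single user, a small database (e.g. SQLite) would be the more robust choice, but a flat file covers the basic "survive a restart" case.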