Closed behrica closed 1 year ago
I had an idea on the conversational memory in functional terms, which is maybe worth to discuss.
A "conversation" could be seen as a series of calls to `gen/complete-template`, where in every call some form of user-supplied callback fn can be triggered. This callback fn could serve 2 purposes:
I am not sure yet how to express this with the means of Clojure.
An example of this is a user-supplied callback fn which accumulates previous answers in the context up to a certain length (the max allowed tokens), beyond which it would need to "summarize" via an extra call to an LLM.
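One way to sketch that accumulate-then-summarize callback in Clojure (everything here is hypothetical: the callback's arity, the word-based token estimate, and the `summarize` fn are assumptions, not the actual bosquet API):

```clojure
(require '[clojure.string :as str])

;; Hypothetical sketch: accumulate answers until a rough token budget
;; is exceeded, then collapse the context into a summary produced by
;; an extra LLM call (`summarize` is a stand-in for that call).
(defn memory-callback
  [max-tokens summarize]
  (let [memory (atom [])]
    (fn [answer]
      (swap! memory conj answer)
      ;; crude token estimate: count whitespace-separated words
      (when (> (count (mapcat #(str/split % #"\s+") @memory)) max-tokens)
        (reset! memory [(summarize (str/join "\n" @memory))]))
      ;; context to carry into the next gen/complete-template call
      (str/join "\n" @memory))))
```

Each `gen/complete-template` call would invoke the callback with the latest answer and get back the (possibly summarized) context for the next prompt.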
I am on vacation. Back next week and happy to discuss.
I have moved the conversation mode part out to this issue #25
Closing this with the latest release supporting memory handling. See https://zmedelis.github.io/bosquet/notebook/examples/short_memory_prosocial_dialog/index.html — long-term memory documentation will be added soon. This is a complex topic and memory handling will evolve.
to continue from #22 regarding memory:
After I have read here: https://archive.pinecone.io/learn/langchain-conversational-memory/
I now see that there are conceptually two very different types of memory.
The article above talks about "conversational memory", i.e. a memory which lives only for the duration of a single, possibly long, conversation between a person and an LLM.
What I have tried is more about giving an LLM access to a "long term persistent memory".
Both have in common that `gen/complete-template` is called several times:

- once for each snippet found
- a final call to `gen/complete-template`, which takes in the results of the above and other instructions to summarize them (indeed, paying attention not to exceed the token limit)

Both might / might not rely on vector databases.
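That two-stage call pattern could be sketched roughly as follows (this assumes `gen/complete-template` takes a template string and a map of values, and uses `{{var}}` placeholders — both are assumptions about the API, and the templates are made up):

```clojure
(require '[clojure.string :as str])

;; Hypothetical sketch of the two-stage pattern: one completion per
;; retrieved snippet, then a final call that summarizes the results.
(defn answer-with-snippets
  [snippets question]
  (let [;; stage 1: one gen/complete-template call per snippet found
        partial-answers
        (mapv (fn [snippet]
                (gen/complete-template
                  "Given the context: {{snippet}}\nAnswer: {{question}}"
                  {:snippet snippet :question question}))
              snippets)]
    ;; stage 2: a final call taking in the results of the above,
    ;; with instructions to summarize them (watching the token limit)
    (gen/complete-template
      "Summarize these partial answers: {{answers}}"
      {:answers (str/join "\n" partial-answers)})))
```

Whether the snippets come from a vector database or a plain keyword search is orthogonal to this call structure.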