Micro AI for multiple LLM switching, preparing datasets, training models, and deploying them in isolated environments using Docker
GNU General Public License v3.0
4 stars · 6 forks
Add ability to include chat session history in the next LLM call for conversation memory #16
Closed
lxy009 closed 3 months ago
We will need user session management to do this, as I don't think many, if any, of the big LLM providers offer this out of the box.
This might require some level of abstraction to work with each different type of API, but the general data structures are about the same.
Default limits will be based on the LLM provider.
Out of scope: summarizing previous history when the limit is reached. We can use a simple window cutoff for now.
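A minimal sketch of what this session memory could look like, assuming a chat-style messages API (role/content dicts) and an in-memory registry keyed by session id. All names here (`ChatSession`, `max_messages`, `get_session`) are hypothetical and not taken from the project:

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ChatSession:
    """Holds one user's conversation history for replay on the next LLM call."""
    max_messages: int = 20  # default limit; could be set per LLM provider
    messages: List[Dict[str, str]] = field(default_factory=list)

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})
        # Simple window cutoff: drop the oldest messages once the limit is hit
        # (summarizing trimmed history is explicitly out of scope for now).
        if len(self.messages) > self.max_messages:
            self.messages = self.messages[-self.max_messages:]

    def as_prompt(self, system_prompt: str, user_input: str) -> List[Dict[str, str]]:
        # Most chat-style APIs accept a list of role/content dicts, so this shape
        # can be adapted per provider behind a thin abstraction layer.
        return [
            {"role": "system", "content": system_prompt},
            *self.messages,
            {"role": "user", "content": user_input},
        ]


# In-memory session registry keyed by user/session id; a real deployment would
# likely persist this outside the process rather than keep it in memory.
sessions: Dict[str, ChatSession] = {}


def get_session(session_id: str) -> ChatSession:
    return sessions.setdefault(session_id, ChatSession())
```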