Currently, the bot can change its capacity based on the user preference. So, instead of tossing (dequeuing) a message and throwing it away, we can grab the necessary information from the chat history and keep it as "necessary" context for the bot to know.
For example, if we prompted the bot to act like a pirate, eventually it will lose the memory of "acting as a pirate" once that message is dequeued.
Essentially, we want a "patient zero" of messages that holds the "extra" context the bot should always remember for a conversation!
Questions to answer
What are the key facts I should keep about this message?
Out of the message, what is the most important? (asking an LLM)
Does this go in a second queue? A different private variable in queue.ts?
Do we use a priority queue, or sort when asking a model to prioritize facts in a message?
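On that last question: a hedged alternative to a full priority queue is to have the model assign each extracted fact a numeric priority and simply sort. A minimal sketch, assuming a `ScoredFact` shape and a `topFacts` helper that are not part of the current code:

```typescript
// Hypothetical shape: the model rates each extracted fact, higher = more important
type ScoredFact = { fact: string; priority: number }

// Keep only the `limit` highest-priority facts, most important first.
// A plain sort like this avoids maintaining a dedicated priority-queue structure.
function topFacts(scored: ScoredFact[], limit: number): string[] {
  return [...scored]
    .sort((a, b) => b.priority - a.priority)
    .slice(0, limit)
    .map(s => s.fact)
}
```

For the small fact lists involved here, sorting when needed is likely simpler than a real priority queue.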
// queue.ts
...
export class Queue<T> implements IQueue<T> {
    private storage: T[] = []
    // list of necessary information for the bot to keep if something gets dequeued
    private necessaryContext: T[] = []

    /**
     * Set up Queue
     * @param capacity max length of queue
     */
    constructor(public capacity: number = 5) {}
...
We could either have a list of "necessary context" entries, or maybe just one entry that we keep appending to.
Another idea is to keep it as its own class that handles this.
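If the own-class route wins, a minimal sketch of what it might look like (the name `NecessaryContext` and its methods are assumptions, not existing code):

```typescript
// Hypothetical standalone class for the "necessary context" idea,
// as an alternative to a second private array inside Queue<T>.
export class NecessaryContext {
  private facts: string[] = []

  // Append a fact extracted from a dequeued message, skipping blanks and duplicates
  add(fact: string): void {
    const trimmed = fact.trim()
    if (trimmed.length > 0 && !this.facts.includes(trimmed)) this.facts.push(trimmed)
  }

  size(): number {
    return this.facts.length
  }

  // Render the kept facts as one system message to prepend to each chat request
  asSystemMessage(): { role: string; content: string } {
    return {
      role: 'system',
      content: 'Remember the following for this conversation:\n- ' + this.facts.join('\n- ')
    }
  }
}
```

Keeping it as its own class also leaves `Queue<T>` generic and unaware of model details.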
Essentially, when we dequeue something from storage, we want to run it through some general model to pick out what is relevant and necessary from the context we are removing from the history.
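That dequeue-time extraction step could look roughly like this. The prompt wording, the function names, and the injected `chat` callback (which would wrap the real `ollama.chat` call) are all assumptions:

```typescript
// Hypothetical extraction step: when a message is dequeued,
// ask a model which facts from it are worth keeping.
type ChatMessage = { role: string; content: string }
type ChatFn = (messages: ChatMessage[]) => Promise<string>

// Build the instruction sent to the model for a dequeued message
export function buildExtractionPrompt(dequeued: string): string {
  return (
    'From the chat message below, list only the facts or instructions the ' +
    'assistant must keep remembering (e.g. "act like a pirate"). ' +
    'Reply with exactly "none" if nothing qualifies.\n\n' + dequeued
  )
}

// Run the dequeued message through the model; `chat` would wrap ollama.chat
export async function extractNecessaryContext(
  dequeued: string,
  chat: ChatFn
): Promise<string | null> {
  const answer = (await chat([{ role: 'user', content: buildExtractionPrompt(dequeued) }])).trim()
  return answer.toLowerCase() === 'none' ? null : answer
}
```

Injecting `chat` as a callback keeps the extraction logic testable without a running model.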
// messageCreate.ts
import { ChatResponse } from 'ollama'
...
let response: ChatResponse
// check if we can push, if not, remove oldest
if (msgHist.size() === msgHist.capacity) msgHist.dequeue()
// push user response before ollama query
msgHist.enqueue({
role: 'user',
content: message.content
})
...
The code above still needs to be modified to handle the capacity change from #39.
Essentially, in the while loop that is to be implemented, we would dequeue messages until the history fits the new capacity, extracting the best relevant facts from each removed message, or none at all if the current context already covers them.
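A rough sketch of that loop, written against plain arrays so the shape is visible; the names (`shrinkToCapacity`, `keepRelevant`) and the synchronous callback standing in for the model call are assumptions, and the real version would operate on `Queue<T>` via `dequeue()`:

```typescript
type Message = { role: string; content: string }

// Dequeue oldest messages until the history fits the (possibly lowered)
// capacity, sending each removed message through an extraction step.
function shrinkToCapacity(
  storage: Message[],
  capacity: number,
  keepRelevant: (msg: Message) => string | null, // model call in the real bot
  necessaryContext: string[]
): void {
  while (storage.length > capacity) {
    const removed = storage.shift()! // oldest message
    const fact = keepRelevant(removed)
    // Keep only new, non-empty facts; skip anything already recorded
    if (fact && !necessaryContext.includes(fact)) necessaryContext.push(fact)
  }
}
```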