TinQ0 opened this issue 7 months ago
Yack doesn't strip earlier conversation messages when sending them along as "context".
It'd probably help to restrict the context to the last 10-15 chat messages when sending a new message, so that token limits don't get exhausted.
It's probably just a matter of doing an `array.slice` on the conversation before sending the messages to the stream; a rough sketch is below.
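For illustration, a minimal sketch of what that slice could look like. The `ChatMessage` shape, constant, and `trimContext` name are assumptions for the example, not Yack's actual code; the only point is the negative-index `slice`:

```ts
// Hypothetical message shape; Yack's real types may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Somewhere in the suggested 10-15 range.
const MAX_CONTEXT_MESSAGES = 12;

// Keep any system prompt, but only the most recent N chat messages.
function trimContext(messages: ChatMessage[]): ChatMessage[] {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");
  // slice with a negative index returns the last N entries.
  return [...system, ...rest.slice(-MAX_CONTEXT_MESSAGES)];
}
```

A smarter version could budget by estimated token count instead of message count, but even a fixed-size window like this would avoid the error below.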
I don't know whether yack truncates the conversation to fit the token limits of certain models, but after a long continuous discussion (think 7+ long questions and/or answers), ChatGPT returns an error stating the token limit is exceeded.