rajatkulkarni95 / yack

macOS Spotlight-like app as a ChatGPT interface!
https://yack.fyi
MIT License

After a conversation gets too long, you start exceeding the token limit #20

Open TinQ0 opened 7 months ago

TinQ0 commented 7 months ago

I don't know whether yack truncates the conversation to fit within the token limits of certain models; however, after a long continuous discussion (think 7+ long questions and/or answers), ChatGPT will return an error stating the token limit is exceeded.

rajatkulkarni95 commented 7 months ago

So it doesn't strip earlier conversations when sending them for "context".

It'd probably be helpful to restrict context to the last 10-15 chat messages when sending a new message, so that the token limit doesn't get exhausted.

Probably just have to do an `array.slice` when sending the message history to the stream.
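
A minimal sketch of what that could look like, assuming the chat history is kept as an array of OpenAI-style chat messages; the names `ChatMessage`, `MAX_CONTEXT_MESSAGES`, and `buildContext` are illustrative and not from the yack codebase:

```ts
// Illustrative sketch, not the actual yack implementation.
type ChatMessage = {
  role: "system" | "user" | "assistant";
  content: string;
};

// Arbitrary cutoff; keeps the request roughly within the model's token limit.
const MAX_CONTEXT_MESSAGES = 15;

// Build the messages payload for a new request, dropping everything
// older than the last MAX_CONTEXT_MESSAGES entries.
function buildContext(
  history: ChatMessage[],
  newMessage: ChatMessage
): ChatMessage[] {
  // slice(-N) returns the last N elements (or the whole array if shorter),
  // so older messages simply fall out of the request payload.
  const recent = history.slice(-MAX_CONTEXT_MESSAGES);
  return [...recent, newMessage];
}

// Usage: send buildContext(history, userMessage) to the streaming
// endpoint instead of the full history.
const history: ChatMessage[] = []; // hypothetical: loaded from wherever the conversation is stored
const payload = buildContext(history, { role: "user", content: "Hello again" });
```

Slicing by message count doesn't strictly guarantee staying under the token limit, since messages vary a lot in length; counting tokens and trimming to a token budget would be more robust, but a fixed slice is the simplest fix.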