As far as I understand, simpleaichat supports chat sessions simply by sending the earlier conversation along with each request. This approach will quickly exceed the model's token limit in cases where the original context is already substantial.
I don't think there is a way to have simpleaichat compress the earlier conversation, so I'll have to implement that outside of simpleaichat. Or did I miss something?
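For context, this is roughly what I mean by compressing outside of simpleaichat: a sketch that replaces older messages with a single summary message once the history grows past a token budget. Everything here is an assumption on my side, not simpleaichat's API — the message format, the `compress_history` helper, and the pluggable `summarize` callback (which you would back with an extra LLM call) are all hypothetical.

```python
def rough_token_count(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return len(text) // 4

def compress_history(messages, budget=3000, keep_recent=4, summarize=None):
    """Replace older messages with one summary message once the estimated
    token count exceeds `budget`. The most recent `keep_recent` messages
    are always kept verbatim. `summarize` is a hypothetical callback
    (e.g. an extra LLM call); without it we just truncate as a fallback.
    Messages are assumed to be dicts with "role" and "content" keys."""
    total = sum(rough_token_count(m["content"]) for m in messages)
    if total <= budget or len(messages) <= keep_recent:
        return messages
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    joined = "\n".join(f'{m["role"]}: {m["content"]}' for m in older)
    summary = summarize(joined) if summarize else joined[: budget * 2]
    return [{"role": "system",
             "content": "Summary of earlier conversation: " + summary}] + recent
```

The idea would be to run this on the session's message list before each request, but doing that cleanly would still need a hook into the library.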
Is anybody aware of a great library that does that?
Thanks!