josephrocca / OpenCharacters

Simple little web interface for creating characters and chatting with them. It's basically a single HTML file - no server. Share characters using a link (character data is stored within the URL itself). All chat data is stored in your browser using IndexedDB. Currently supports OpenAI APIs and ~any Hugging Face model.
https://josephrocca.github.io/OpenCharacters
MIT License
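
Since the whole app is client-side, sharing a character works by packing the character data into the link itself. Below is a minimal sketch of that idea in browser JavaScript; the property names and the JSON-plus-base64 encoding are illustrative assumptions, not OpenCharacters' actual format.

```js
// Sketch: serialize a character object to JSON and place it in the URL hash
// fragment, so a link alone can carry the character (no server needed).
// Property names and encoding here are illustrative, not the app's real schema.
const character = {
  name: "Example Bot",
  roleInstruction: "You are a helpful assistant.",
  avatarUrl: "",
};

// Encode: JSON -> percent-encoded UTF-8 -> base64 -> hash fragment.
const encoded = btoa(unescape(encodeURIComponent(JSON.stringify(character))));
const shareLink = `https://josephrocca.github.io/OpenCharacters/#${encoded}`;

// Decode on page load: read the hash, reverse the encoding, parse the JSON.
const hash = new URL(shareLink).hash.slice(1);
const restored = JSON.parse(decodeURIComponent(escape(atob(hash))));
console.log(restored.name); // "Example Bot"
```
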

Recurring context_length_exceeded errors after long conversation #19

Closed: perkopoulos closed this issue 1 year ago

perkopoulos commented 1 year ago

Running tonight directly from https://josephrocca.github.io/OpenCharacters/#, I created a couple of new characters and eventually had a very long chat with one (perhaps 200-300 message pairs? More? Hard to say). All was well, but then I suddenly started getting the attached context_length_exceeded error. The length exceeds 4097 by only a small number of tokens. The error repeats with every submission from this point on, although the token counts vary slightly. I tried deleting and resubmitting my last input, plus many retries, but they all now fail the same way. Switching to a new instance of the same character works fine, then it fails again when I switch back to the long chat. So my long chat is basically over as a result.

The last input that first triggered this is nothing special, and is in fact rather short, so I think there's something off in your max context length calculation. I saved the thread at that point and can provide it, along with a full screenshot or the bot name, if required.

(attached screenshot: ContextLengthError)

josephrocca commented 1 year ago

Sorry about this - going to try to fix it today. Token counts are currently an estimate, so I add a buffer, and it looks like the buffer wasn't big enough here. I've just pushed an update that increases the buffer as a temporary fix, which should hopefully give you your thread back in the meantime (refresh the page to get the update).
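
For context, a rough sketch of the general approach described here (estimating token counts and reserving a safety buffer before deciding how many old messages to keep) might look like the following. The numbers, function names, and 4-characters-per-token heuristic are illustrative assumptions, not the project's actual code.

```js
// Sketch: tokens are estimated rather than counted exactly, so a safety
// buffer is subtracted from the model's context window before trimming.
const CONTEXT_LIMIT = 4097;   // e.g. gpt-3.5-turbo context window
const SAFETY_BUFFER = 200;    // margin for estimation error (the value being increased)
const MAX_COMPLETION = 500;   // tokens reserved for the model's reply

function estimateTokens(text) {
  return Math.ceil(text.length / 4); // crude heuristic, not a real tokenizer
}

// Keep the most recent messages whose estimated size fits under the budget.
function trimMessagesToFit(messages) {
  const budget = CONTEXT_LIMIT - MAX_COMPLETION - SAFETY_BUFFER;
  const kept = [];
  let used = 0;
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i].content) + 4; // per-message overhead
    if (used + cost > budget) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}
```

If the estimate undershoots the real tokenizer by more than the buffer, the request still exceeds the limit, which matches the behavior reported above.
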

Someone on the Discord server got a much bigger token overrun, which suggests an actual bug in the context length calculation, so if you get this again and it's significantly over (e.g. 4500+), please do post another comment here and share the thread JSON if possible (you can DM me at rocca#4242 on Discord).

josephrocca commented 1 year ago

@perkopoulos Actually, can you share that thread with me on Discord, or via some other means? I'm thinking this is actually a bug and not just the estimate being off by a bit.

perkopoulos commented 1 year ago

@josephrocca I DMed you the thread on Discord. Your fix has already gotten me past where I was stuck yesterday, so thanks, although there might be a deeper problem. I thought it could be related to all the accumulating summaries at the bottom as the thread gets long, but that was just a wild guess.

josephrocca commented 1 year ago

Fixed, as discussed on Discord - thanks again!