I'm receiving the following error:

This model's maximum context length is 4097 tokens, however you requested 4396 tokens (3896 in your prompt; 500 for the completion). Please reduce your prompt; or completion length.
Since it stores 10 history entries by default, the conversation often gets stuck at some point. We should probably trim some old messages to stay within the model's token limit.
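
A minimal sketch of that trimming, assuming a Python client with tiktoken installed, a cl100k-tokenized chat model, and a messages list whose first entry is the system prompt; `count_tokens` and `trim_history` are hypothetical helper names, and the per-message formatting overhead is ignored:

```python
import tiktoken

MAX_CONTEXT = 4097        # model's context window, from the error message
COMPLETION_BUDGET = 500   # tokens reserved for the completion, from the error message

# Assumption: a cl100k-tokenized chat model; swap the encoding for other models.
enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(messages):
    # Rough count: sums content tokens only, ignoring chat formatting overhead.
    return sum(len(enc.encode(m["content"])) for m in messages)

def trim_history(messages):
    """Drop the oldest non-system messages until the prompt fits the budget."""
    budget = MAX_CONTEXT - COMPLETION_BUDGET
    trimmed = list(messages)
    # Assumption: messages[0] is the system prompt and should always be kept.
    while count_tokens(trimmed) > budget and len(trimmed) > 2:
        del trimmed[1]  # drop the oldest user/assistant turn
    return trimmed
```

Calling `trim_history(messages)` before each request would keep the prompt under 3597 tokens (4097 minus the 500 reserved for the completion), instead of always sending the full 10-entry history.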