Closed: raufbisov closed this issue 1 month ago
I think this is happening because I haven't set a limit on the chat's context. Every time you ask something, all of the previous messages are sent to the API along with the new one. The ChatGPT model has a limited context of 4096 tokens, so exceeding that limit results in an error. This issue will be fixed, but I am occupied with exams until 7 July.
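One possible fix is to trim the oldest messages before each request so the history fits within a token budget. The sketch below is only an illustration, not the extension's actual code: it assumes messages are `{"role": ..., "content": ...}` dicts and uses a crude characters-per-token estimate (a real implementation would use a proper tokenizer such as `tiktoken`).

```python
# Sketch: keep only the most recent messages that fit a token budget.
# Assumption: messages are dicts like {"role": "user", "content": "..."},
# oldest first. Token counts are rough estimates (~4 chars per token).

MODEL_CONTEXT = 4097      # model's maximum context length (from the error)
COMPLETION_BUDGET = 1000  # tokens reserved for the model's reply

def estimate_tokens(text):
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def trim_history(messages, limit=MODEL_CONTEXT - COMPLETION_BUDGET):
    """Drop the oldest messages until the estimated total fits `limit`."""
    kept, total = [], 0
    for msg in reversed(messages):        # walk newest-first
        cost = estimate_tokens(msg["content"])
        if total + cost > limit:
            break                         # everything older is dropped
        kept.append(msg)
        total += cost
    return list(reversed(kept))           # restore chronological order
```

Calling `trim_history(conversation)` before each API request would bound the prompt size at the cost of the model forgetting the earliest turns.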
+1 to this request. When working through some complex conversations that build on each other, that context is really important.
I'm really enjoying this extension so far. 😄
Attached is the error response; see also the related issue #30.
{
"error": {
"message": "This model's maximum context length is 4097 tokens. However, you requested 4610 tokens (1610 in the messages, 3000 in the completion). Please reduce the length of the messages or completion.",
"type": "invalid_request_error",
"param": "messages",
"code": "context_length_exceeded"
}
}
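The numbers in the error add up: 1610 prompt tokens plus the requested 3000-token completion is 4610, which exceeds the 4097-token limit. A complementary fix, sketched below purely as an illustration, is to clamp the requested `max_tokens` to whatever room the prompt leaves (the names and thresholds here are assumptions, not the extension's real API):

```python
# Sketch: shrink the completion budget so prompt + completion fits the
# model's context window. `prompt_tokens` would come from a tokenizer.

MODEL_CONTEXT = 4097        # model's maximum context length
DESIRED_COMPLETION = 3000   # what the extension currently requests
MIN_COMPLETION = 64         # below this, trim the history instead

def clamp_max_tokens(prompt_tokens,
                     desired=DESIRED_COMPLETION,
                     context=MODEL_CONTEXT):
    """Return a max_tokens value that keeps the request within `context`."""
    available = context - prompt_tokens
    if available < MIN_COMPLETION:
        raise ValueError("prompt too long; trim the message history first")
    return min(desired, available)
```

With the figures from the error above, `clamp_max_tokens(1610)` would request 2487 completion tokens instead of 3000, keeping the total at exactly the 4097-token limit.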
After about 15 minutes of using ChatGPT, it responds with "Error: HTTP error! status: 400". There is a workaround: disable the context feature (the green button in the upper right). Still, it would be nice to be able to keep this feature enabled.