Closed: BDuba closed this issue 1 year ago.
If the text behind a link is too long, the bot gives the usual reply: "Failed to answer. Reason: openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 5651 tokens. Please reduce the length of the messages." Is it possible to automatically split the text into several parts, each no longer than 4097 tokens for English and no more than 3000 tokens for Russian?

There is no way to feed the OpenAI model a message that is larger than its context limit. If you split the message in two, the model will simply treat the parts as two unrelated messages.

However, the bot should try to shorten long input messages before sending them to OpenAI. If you get this error, the truncation feature may not be working as expected.

A few questions: which version of the bot are you running? You can check it with the /version command.

Fixed in v104.
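For illustration, here is a minimal sketch of how token-aware truncation or splitting could be done with the tiktoken library. The function names, the 4096-token budget, and the gpt-3.5-turbo model choice are assumptions for this sketch, not the bot's actual code.

```python
import tiktoken

def split_by_tokens(text: str, max_tokens: int = 4096,
                    model: str = "gpt-3.5-turbo") -> list[str]:
    """Split text into chunks of at most max_tokens tokens each.

    Note: a real budget must stay below the 4097-token context limit,
    leaving room for the system prompt and the model's reply.
    """
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + max_tokens])
            for i in range(0, len(tokens), max_tokens)]

def truncate_to_tokens(text: str, max_tokens: int = 4096) -> str:
    """Keep only the first max_tokens tokens of text (truncation)."""
    chunks = split_by_tokens(text, max_tokens)
    return chunks[0] if chunks else ""
```

As noted above, splitting does not make the model see the parts as one message; each chunk would have to be processed independently (for example, summarized on its own). Separate per-language limits are also unnecessary here: the API enforces token counts, not character counts, and tiktoken measures tokens directly for any language, Russian included.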