nalgeon / pokitoki

Humble GPT Telegram Bot
MIT License

Message token count exceeds the maximum allowed token count #12

Closed · BDuba closed this issue 1 year ago

BDuba commented 1 year ago

If the linked text is too long, the bot replies with the usual error: "Failed to answer. Reason: openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 5651 tokens. Please reduce the length of the messages." Is it possible to automatically split the text into several parts, no longer than 4097 tokens for English and no more than 3000 tokens for Russian?

nalgeon commented 1 year ago

There is no way to feed the OpenAI model a message that is larger than the limit. If you split the message into two, the model will just treat them as two unrelated messages.

However, the bot should try to shorten long input messages before sending them to OpenAI. If you get this error, the truncation feature may not be working as expected.
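For illustration, here is a minimal sketch of what such input truncation could look like. This is a hypothetical helper, not the bot's actual code: the `shorten` function and the ~4-characters-per-token heuristic are assumptions (a real implementation should count tokens with a proper tokenizer such as tiktoken, and the ratio is lower for Russian than for English).

```python
# Hypothetical truncation helper (a sketch, not pokitoki's actual implementation).
# Assumes roughly 4 characters per token for English text as a crude estimate.

MAX_TOKENS = 4097       # context window mentioned in the error message
CHARS_PER_TOKEN = 4     # rough average for English; Russian packs fewer chars per token

def shorten(text: str, max_tokens: int = MAX_TOKENS) -> str:
    """Truncate text so its estimated token count fits within the model limit."""
    max_chars = max_tokens * CHARS_PER_TOKEN
    if len(text) <= max_chars:
        return text
    # cut at the last word boundary before the limit to avoid splitting a word
    cut = text[:max_chars]
    return cut.rsplit(" ", 1)[0]
```

A character-based estimate like this can still overshoot the real token count, which would explain errors slipping through even with truncation enabled.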

A few questions:

  1. Please provide the output of the /version command.
  2. Please provide the input message to reproduce the error.
alekseysotnikov commented 1 year ago
  1. version: 93
  2. read https://en.wikipedia.org/wiki/Dog
nalgeon commented 1 year ago

Fixed in v104: