n3d1117 / chatgpt-telegram-bot

🤖 A Telegram bot that integrates with OpenAI's official ChatGPT APIs to provide answers, written in Python
GNU General Public License v2.0
3.01k stars 1.39k forks

Wrong conversation summary size vs MAX_TOKENS #610

Open delfer opened 3 months ago

delfer commented 3 months ago

Config

OPENAI_MODEL=gpt-3.5-turbo-0125
MAX_TOKENS=16385

Output

2024-07-03 14:39:30,037 - root - INFO - Chat history for chat ID 352569383 is too long. Summarising...
2024-07-03 14:39:34,054 - root - ERROR - This endpoint's maximum context length is 16385 tokens. However, you requested about 17537 tokens (39 of text input, 1113 of tool input, 16385 in the output). Please reduce the length of either one.

Problem: when the chat history is too long, it is summarised to a summary of up to MAX_TOKENS. The bot then sends a request totalling MAX_TOKENS + input + tool_input tokens to the model, which exceeds MAX_TOKENS (the model's full context length).
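A minimal sketch of the budget arithmetic (illustrative only, not the bot's actual code — `summary_budget` is a hypothetical helper): since `MAX_TOKENS` is set to the model's full 16385-token context window, the summary alone can fill the window, leaving no room for the prompt and tool input. Reserving headroom for those inputs avoids the error:

```python
CONTEXT_WINDOW = 16385   # gpt-3.5-turbo-0125 context length
MAX_TOKENS = 16385       # value from the config above

def summary_budget(prompt_tokens: int, tool_tokens: int,
                   context_window: int = CONTEXT_WINDOW,
                   max_tokens: int = MAX_TOKENS) -> int:
    """Cap the summary so prompt + tool input + summary fits the window."""
    return min(max_tokens, context_window - prompt_tokens - tool_tokens)

# Figures from the error log above: 39 tokens of text input, 1113 of tool input.
print(summary_budget(39, 1113))  # 15233 instead of the full 16385
```

With the original config, 39 + 1113 + 16385 = 17537 tokens are requested, matching the number in the error message.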

delfer commented 3 months ago

fixed in https://github.com/n3d1117/chatgpt-telegram-bot/pull/614