deep-foundation / russian-laws-bot


Context limit does not work #3

Open · Konard opened 9 months ago

Konard commented 9 months ago
2024-02-19 10:07:18,498 - __main__ - ERROR - OpenAI completion error: This model's maximum context length is 128000 tokens. However, your messages resulted in 130444 tokens. Please reduce the length of the messages.
2024-02-19 10:07:18,499 - __main__ - ERROR - This model's maximum context length is 128000 tokens. However, your messages resulted in 130444 tokens. Please reduce the length of the messages.
FreePhoenix888 commented 9 months ago

What is the reason?
https://github.com/deep-foundation/russian-laws-bot/blob/149a8f351144e7b2e132da03877303938973b677/main.py#L25-L26
https://github.com/deep-foundation/russian-laws-bot/blob/149a8f351144e7b2e132da03877303938973b677/main.py#L214-L217

Konard commented 9 months ago

@FreePhoenix888 The calculation is wrong because right now only the token count of a single message's content is computed. To fix that bug, we have to count the tokens of the entire context (all messages that will be sent to GPT-4). That will require logic similar to what we have in our ChatGPT package in Deep. Alternatively, we can ignore this issue if we switch to the actual ChatGPT Deep package.

As a workaround for this and other issues, I currently send only a single message to the GPT-4 API.
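
For reference, counting tokens over the whole message list (rather than a single message's content) could look roughly like the sketch below. This is a minimal illustration, not the repo's actual code: the model name, the reply reserve, the per-message overhead constant, and the assumption that `messages[0]` is the system prompt are all guesses, and the estimate follows the OpenAI cookbook's approximation rather than an exact count.

```python
# Rough sketch (not the repo's code): count tokens over the entire message
# list and drop the oldest non-system messages until the request fits under
# the model's context window.
import tiktoken

MAX_CONTEXT_TOKENS = 128_000   # limit reported in the error above
TOKENS_PER_MESSAGE = 4         # approximate per-message overhead (role, separators)
REPLY_RESERVE = 4_096          # assumed headroom left for the completion itself


def count_tokens(messages, model="gpt-4-turbo-preview"):  # model name is an assumption
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        encoding = tiktoken.get_encoding("cl100k_base")
    total = 0
    for message in messages:
        total += TOKENS_PER_MESSAGE
        for value in message.values():
            total += len(encoding.encode(str(value)))
    return total


def trim_to_fit(messages):
    # Keep the first message (assumed to be the system prompt) and drop the
    # oldest user/assistant messages until the whole context fits.
    trimmed = list(messages)
    while count_tokens(trimmed) > MAX_CONTEXT_TOKENS - REPLY_RESERVE and len(trimmed) > 1:
        del trimmed[1]
    return trimmed
```

The bot could call `trim_to_fit(messages)` right before the completion request, so an oversized context is shortened instead of triggering the error shown above.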

FreePhoenix888 commented 9 months ago

> @FreePhoenix888 The calculation is wrong because right now only the token count of a single message's content is computed. To fix that bug, we have to count the tokens of the entire context (all messages that will be sent to GPT-4). That will require logic similar to what we have in our ChatGPT package in Deep. Alternatively, we can ignore this issue if we switch to the actual ChatGPT Deep package.
>
> As a workaround for this and other issues, I currently send only a single message to the GPT-4 API.

Should we prefer the ChatGPT Deep package?