Konard opened this issue 9 months ago · Status: Open
@FreePhoenix888 the calculation is wrong, because currently only the token count of a single message's content is calculated. To fix that bug, we have to calculate the tokens of the entire context (all messages that will be sent to GPT-4). That will require logic similar to what we have in our ChatGPT package in Deep. Or we can simply ignore this issue if we use the actual ChatGPT Deep package.

As a workaround for that and other issues, I now send only a single message to the GPT-4 API.
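For reference, a minimal sketch of what counting the whole context could look like, assuming the `js-tiktoken` npm package and a plain `{ role, content }` message shape; the per-message overhead constants follow the OpenAI cookbook estimate for gpt-4 and are an assumption here, not our actual package logic:

```typescript
// Sketch: count tokens for the entire context, not just one message's content.
import { encodingForModel } from "js-tiktoken";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

const TOKENS_PER_MESSAGE = 3; // assumed fixed overhead per message (cookbook estimate)
const TOKENS_PER_REPLY = 3;   // assumed priming tokens for the assistant reply

export function countContextTokens(messages: ChatMessage[]): number {
  const enc = encodingForModel("gpt-4");
  let total = TOKENS_PER_REPLY;
  for (const message of messages) {
    total += TOKENS_PER_MESSAGE;
    total += enc.encode(message.role).length;
    total += enc.encode(message.content).length;
  }
  return total;
}

// Usage: check the whole context against the model limit before calling the API,
// e.g. trim or summarize the oldest messages when the total exceeds it.
// const tokens = countContextTokens(allMessagesToSend);
// if (tokens > 8192) { /* trim oldest messages */ }
```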
Should we prefer the ChatGPT Deep package?