Cainier / gpt-tokens

Calculate the token consumption and cost of OpenAI GPT messages
MIT License

Message token count algorithm differs from the one in the OpenAI official cookbook #16

Closed bofeng closed 1 year ago

bofeng commented 1 year ago

On this line: https://github.com/Cainier/gpt-tokens/blob/main/index.js#L170

For the model "gpt-3.5-turbo-0613", the code at line 170 uses tokens_per_message = 4, tokens_per_name = -1,

but in OpenAI's cookbook: https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb , the num_tokens_from_messages function (snippet In [14]) uses tokens_per_message = 3, tokens_per_name = 1.

Also, from their code, gpt-3.5-turbo-0301 and gpt-3.5-turbo-0613 use different token-count methods, but the code in this repo uses the same method for both: https://github.com/Cainier/gpt-tokens/blob/main/index.js#L164
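For reference, the per-model branching the cookbook describes can be sketched in JavaScript like this. This is only an illustration of the overhead accounting, not the repo's actual code: the real implementation would tokenize with tiktoken's cl100k_base encoder, while here `encode` is a whitespace-splitting stub standing in for it.

```javascript
// Stub tokenizer: stands in for tiktoken's cl100k_base encoder.
const encode = (text) => text.split(/\s+/).filter(Boolean)

function numTokensFromMessages (messages, model = 'gpt-3.5-turbo-0613') {
  let tokensPerMessage, tokensPerName
  if (model === 'gpt-3.5-turbo-0301') {
    tokensPerMessage = 4 // every message follows <|start|>{role/name}\n{content}<|end|>\n
    tokensPerName = -1   // if a name is present, the role token is omitted
  } else {
    tokensPerMessage = 3 // gpt-3.5-turbo-0613 and later models
    tokensPerName = 1
  }
  let numTokens = 0
  for (const message of messages) {
    numTokens += tokensPerMessage
    for (const [key, value] of Object.entries(message)) {
      numTokens += encode(value).length
      if (key === 'name') numTokens += tokensPerName
    }
  }
  numTokens += 3 // every reply is primed with <|start|>assistant<|message|>
  return numTokens
}
```

With the stub tokenizer, the same one-message conversation costs one token more under the -0301 rules than under the -0613 rules, which is exactly the mismatch reported above.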

Cainier commented 1 year ago

Thanks, you are right. I just checked OpenAI's cookbook code (snippet In [14]) and fixed this issue in v1.0.10.