Cainier / gpt-tokens

Calculate the token consumption and cost of OpenAI GPT messages
MIT License

Cache encodings in module #26

Closed · lox closed this 1 year ago

lox commented 1 year ago

Fixes https://github.com/Cainier/gpt-tokens/issues/25
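
The idea is to cache the tokenizer encodings at module scope so that only the first `GPTTokens` construction for a given model pays the encoder setup cost. A minimal sketch of that idea, assuming the `js-tiktoken` `encodingForModel` API (the actual gpt-tokens internals may differ):

```typescript
// Sketch of module-level encoding caching, assuming the js-tiktoken
// `encodingForModel` API; the real gpt-tokens internals may differ.
import { encodingForModel } from "js-tiktoken";
import type { Tiktoken, TiktokenModel } from "js-tiktoken";

// One shared encoder per model, built lazily on first use.
const encodingCache = new Map<TiktokenModel, Tiktoken>();

function getCachedEncoding(model: TiktokenModel): Tiktoken {
    let encoding = encodingCache.get(model);
    if (!encoding) {
        // Slow path: building the BPE encoder is the expensive step.
        encoding = encodingForModel(model);
        encodingCache.set(model, encoding);
    }
    return encoding;
}

// Every call after the first for a given model reuses the cached encoder.
const enc = getCachedEncoding("gpt-3.5-turbo");
console.log(enc.encode("Hello world").length);
```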

Testing performance
GPTTokens: 127.178ms
GPTTokens: 0.278ms
GPTTokens: 0.129ms
GPTTokens: 0.137ms
GPTTokens: 0.065ms
GPTTokens: 0.065ms
GPTTokens: 0.114ms
GPTTokens: 0.055ms
GPTTokens: 0.049ms
GPTTokens: 0.118ms
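
The timings above look like repeated `console.time("GPTTokens")` measurements around construction. A hedged sketch of such a loop, using the constructor options shown in the gpt-tokens README (the PR's actual test script may differ):

```typescript
// Hypothetical benchmark loop; constructor shape follows the gpt-tokens README.
import { GPTTokens } from "gpt-tokens";

for (let i = 0; i < 10; i++) {
    console.time("GPTTokens");
    const usage = new GPTTokens({
        model: "gpt-3.5-turbo",
        messages: [{ role: "user", content: "Hello world" }],
    });
    console.timeEnd("GPTTokens");
    // With encodings cached at module level, only the first iteration
    // should pay the encoder setup cost (~127 ms vs ~0.1 ms afterwards).
    void usage.usedTokens;
}
```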