dqbd / tiktoken

JS port and JS/WASM bindings for openai/tiktoken
MIT License

How to count tokens for functions defined in the prompt? #78

Open vrde opened 8 months ago

vrde commented 8 months ago

I pass functions, in addition to a custom prompt and a history of messages, in my chatCompletion calls.

I need to calculate the number of tokens in my calls to avoid hitting the model's token limit.

I know how to count the tokens in my prompt, but how do I do it for the functions? Should I just tokenize the JSON describing the functions?

waptik commented 8 months ago

You can consider checking out this package