niieani / gpt-tokenizer

JavaScript BPE Tokenizer Encoder Decoder for OpenAI's GPT-2 / GPT-3 / GPT-4 / GPT-4o. Port of OpenAI's tiktoken with additional features.
https://gpt-tokenizer.dev
MIT License
424 stars 35 forks

Support function calls? #21

Open seoker opened 1 year ago

seoker commented 1 year ago

As you may know, function calls are now supported by OpenAI, and the function definitions count toward the prompt's token usage. With some googling, I found the calculation here.

It would be great if the library could also calculate the required tokens when using function calls. 🙏🏼

chuanqisun commented 1 year ago

One related issue is the type signature of ChatMessage. Depending on whether function calling is used, a ChatMessage may have either the content field or the function_call field, but not both. The current typing in gpt-tokenizer/src/GptEncoding.ts will need an update.
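One way to encode that "either/or" constraint is a discriminated union; this is only a sketch of what the updated typing could look like, not the library's actual types (`FunctionCall` and the `never` trick are assumptions):

```typescript
// Sketch (assumed shape, not gpt-tokenizer's real types): a message carries
// either `content` or `function_call`, never both.
interface FunctionCall {
  name: string;
  arguments: string; // JSON-encoded arguments from the model
}

type ChatMessage =
  | { role: 'system' | 'user' | 'assistant'; content: string; function_call?: never }
  | { role: 'assistant'; content?: never; function_call: FunctionCall };

// Runtime narrowing helper so token counting can branch on the variant.
function isFunctionCallMessage(
  m: ChatMessage,
): m is Extract<ChatMessage, { function_call: FunctionCall }> {
  return 'function_call' in m && m.function_call !== undefined;
}
```

With this typing, passing both fields (or neither) becomes a compile-time error rather than something the encoder has to guard against at runtime.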

NatoBoram commented 6 months ago

I saw that https://github.com/hmarr/openai-chat-tokens already does it - would it be fine to import that package here to provide that functionality?
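For anyone picking this up: openai-chat-tokens estimates the overhead by rendering the function definitions in a TypeScript-like `namespace functions` block and tokenizing that rendered text. A minimal sketch of that rendering, with the exact layout and field names assumed from that package's approach (the small per-model fixed token overhead is deliberately not modeled here):

```typescript
// Assumed JSON-schema subset for a function definition, per the OpenAI API.
interface FunctionDef {
  name: string;
  description?: string;
  parameters: {
    type: 'object';
    properties: Record<string, { type: string; description?: string; enum?: string[] }>;
    required?: string[];
  };
}

// Render definitions the way the model is believed to see them; the token
// count of this string approximates the function-call prompt overhead.
function formatFunctionDefinitions(functions: FunctionDef[]): string {
  const lines: string[] = ['namespace functions {', ''];
  for (const fn of functions) {
    if (fn.description) lines.push(`// ${fn.description}`);
    lines.push(`type ${fn.name} = (_: {`);
    for (const [name, prop] of Object.entries(fn.parameters.properties)) {
      if (prop.description) lines.push(`// ${prop.description}`);
      const type = prop.enum
        ? prop.enum.map((v) => JSON.stringify(v)).join(' | ')
        : prop.type;
      const optional = fn.parameters.required?.includes(name) ? '' : '?';
      lines.push(`${name}${optional}: ${type},`);
    }
    lines.push(`}) => any;`, '');
  }
  lines.push('} // namespace functions');
  return lines.join('\n');
}
```

An implementation inside gpt-tokenizer would then encode this string with the model's encoding and add the fixed overhead, which is why no external dependency should be needed.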

niieani commented 1 month ago

PRs welcome! I would prefer not to pull in an external dependency, and it looks like the code isn't too complex.