zurawiki / tiktoken-rs

Ready-made tokenizer library for working with GPT and tiktoken

async-openai 0.18 support #56

ChristopherJMiller opened this issue 8 months ago (status: Open)

ChristopherJMiller commented 8 months ago

OpenAI's spec introduced some new capabilities that led to some larger changes in the async-openai crate. I'm more than happy to contribute this work to tiktoken-rs, but I wanted to open a dialogue first about one of the larger structural changes:

The biggest change (imo) is that the structure of ChatCompletionRequestMessage was reworked to use a different data type per role.

This makes getting message content less trivial, but it seemed necessary since OpenAI now supports user messages that include images when invoking certain models, so it will require a data structure change on the tiktoken-rs side as well. I'm not very familiar with this space beyond general application use, so I'm looking for input on how these new user messages should be handled for token counting.
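For concreteness, here is a rough sketch of the per-role shape and how the countable text might be pulled out of it. The types below are simplified stand-ins for the async-openai 0.18 enums (per-role variants, with user content being either plain text or a list of parts), not the crate's exact definitions:

```rust
// Simplified stand-ins for the async-openai 0.18 message types;
// the real enums carry more fields (names, tool calls, etc.).
enum UserContentPart {
    Text(String),
    // Image parts have no obvious token count in tiktoken -- this is the open question.
    ImageUrl(String),
}

enum UserContent {
    Text(String),
    Parts(Vec<UserContentPart>),
}

enum ChatMessage {
    System { content: String },
    User { content: UserContent },
    Assistant { content: Option<String> },
}

/// Collect the text of a message that can be fed to the existing BPE counting path.
/// Image parts are simply skipped here, which is exactly the part needing input.
fn countable_text(msg: &ChatMessage) -> String {
    match msg {
        ChatMessage::System { content } => content.clone(),
        ChatMessage::Assistant { content } => content.clone().unwrap_or_default(),
        ChatMessage::User { content } => match content {
            UserContent::Text(text) => text.clone(),
            UserContent::Parts(parts) => parts
                .iter()
                .filter_map(|p| match p {
                    UserContentPart::Text(t) => Some(t.as_str()),
                    UserContentPart::ImageUrl(_) => None,
                })
                .collect::<Vec<_>>()
                .join("\n"),
        },
    }
}

fn main() {
    let messages = vec![
        ChatMessage::System { content: "You are a helpful assistant.".to_string() },
        ChatMessage::User {
            content: UserContent::Parts(vec![
                UserContentPart::Text("Describe this image".to_string()),
                UserContentPart::ImageUrl("https://example.com/cat.png".to_string()),
            ]),
        },
    ];
    for msg in &messages {
        println!("{}", countable_text(msg));
    }
}
```

The open question is what, if anything, the image parts should contribute to the count.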

Dreaming-Codes commented 7 months ago

This discussion was already started in #50.