Closed — jiadingfang closed this issue 1 month ago
@jiadingfang I suspect it's the same one gpt-4o uses
I am curious about this as well.
I'm also curious about it
I'm also very curious about this issue. It seems to be basically the same tokenizer as the gpt-4o model, though it looks to me like it includes some additional tokens.
Curious +1
Need this support!
Why does tiktoken.encodingForModel("gpt-4o-mini") still not work?
Error: No tokenizer found for model gpt-4o-mini
I see different input token counts between o1-preview and gpt-4o, so probably a different tokenizer. Update: I take that back; most likely a different system prompt is set behind the scenes and counted as input tokens.
This is fixed in tiktoken 0.8. Yes, from the user's perspective the o1 tokenizer works the same way as the 4o tokenizer.
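For anyone hitting the lookup error above: model-name resolution in tiktoken is essentially a prefix table from model names to encoding names, so an outdated table is what produces "No tokenizer found". A minimal sketch of that lookup, assuming the prefix-to-encoding entries mirror tiktoken 0.8's public model table (this is an illustration, not the library's actual code):

```python
# Hypothetical mapping mirroring tiktoken 0.8's model-prefix table.
# Both the gpt-4o and o1 families resolve to the o200k_base encoding.
MODEL_PREFIX_TO_ENCODING = {
    "gpt-4o-": "o200k_base",
    "o1-": "o200k_base",
    "gpt-4-": "cl100k_base",
}

def encoding_name_for_model(model: str) -> str:
    """Resolve a model name to its encoding by prefix match."""
    for prefix, encoding in MODEL_PREFIX_TO_ENCODING.items():
        if model.startswith(prefix):
            return encoding
    # Older tiktoken versions raise exactly this kind of error for new models.
    raise KeyError(f"No tokenizer found for model {model}")

print(encoding_name_for_model("gpt-4o-mini"))  # o200k_base
print(encoding_name_for_model("o1-preview"))   # o200k_base
```

So if an installed tiktoken predates a model's release, its table has no matching prefix and the lookup fails; upgrading to 0.8 (which knows the o1- prefix) is the fix.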
Hi OpenAI devs,
Congrats on releasing the o1 models; I'm excited to test them out. Since there seem to be new "reasoning" tokens in generation, I wonder whether there is a new version of tiktoken for the o1 models?
Thanks in advance!