dqbd / tiktokenizer

Online playground for OpenAI tokenizers
https://tiktokenizer.vercel.app
MIT License
707 stars 88 forks

Tweak Request: Don't clear prompt textbox when switching models. #6

Closed JonathanFly closed 1 year ago

JonathanFly commented 1 year ago

9 times out of 10, if I'm switching to a new model, it's specifically to compare and visualize prompt sizes between different encoders and models, on the same prompt. Just a little tweak that would personally save me about 100 copy and pastes every day.

dqbd commented 1 year ago

Hello @JonathanFly!

Sorry for the long delay. Currently, the prompt textbox is only cleared when switching from a chat model to an autocomplete model (e.g. gpt-3.5-turbo to cl100k_base), which I believe does make sense?

If I understand the issue correctly, you are comparing the ChatML prompt messages between different encoders?

dqbd commented 1 year ago

Closing this issue due to inactivity. Feel free to reopen if necessary.

PeterDaveHello commented 6 months ago

@dqbd It looks different to me: the textbox is always cleared no matter which model I switch to, even from gpt-4 to gpt-3.5-turbo.

PeterDaveHello commented 6 months ago

Is this the reason why?

https://github.com/dqbd/tiktokenizer/blob/071fb6e059d792aaaf2afbd8e11ece6c9d580840/src/pages/index.tsx#L63-L67
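The behavior dqbd describes (clear only when crossing the chat/autocomplete boundary, since chat models take ChatML messages while plain encoders take raw text) can be sketched as a small predicate. This is a minimal illustration, not the project's actual code; `isChatModel`, `shouldClearPrompt`, and the model list are hypothetical names assumed for the example.

```typescript
// Hypothetical helper: decide whether the prompt textbox should be
// reset when the user switches models. Chat models expect ChatML-style
// messages, while raw encoders (e.g. cl100k_base) take plain text, so
// only a switch across that boundary warrants clearing the prompt.
const CHAT_MODELS = new Set(["gpt-3.5-turbo", "gpt-4"]); // assumed list

function isChatModel(model: string): boolean {
  return CHAT_MODELS.has(model);
}

function shouldClearPrompt(prevModel: string, nextModel: string): boolean {
  // Clear only when the input format changes (chat <-> autocomplete);
  // switching between two chat models keeps the prompt intact.
  return isChatModel(prevModel) !== isChatModel(nextModel);
}
```

Under this logic, switching gpt-4 → gpt-3.5-turbo would preserve the prompt, matching the requested behavior, while gpt-3.5-turbo → cl100k_base would still clear it. The bug PeterDaveHello observes suggests the linked code clears unconditionally on any model change.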