
tiktoken usage #341

Open zhaochenyang20 opened 10 months ago

zhaochenyang20 commented 10 months ago

https://github.com/neulab/prompt2model/pull/335#issuecomment-1711578817

Another thing: I am not sure that all the API-based models use tiktoken to compute input tokens, so the number of input tokens may differ slightly between these models. But the difference should be minimal.
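For context, this is roughly how input tokens are counted with tiktoken today (a minimal sketch; the model name and prompt text are just illustrative):

```python
import tiktoken

# Map an OpenAI model name to its tokenizer; other providers may tokenize differently.
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")

prompt = "Generate a dataset of question-answer pairs."
num_input_tokens = len(encoding.encode(prompt))
print(num_input_tokens)
```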

neubig commented 10 months ago

I'm not 100% sure whether we actually need to fix this. We could possibly just use tiktoken unless we run into problems.

neubig commented 10 months ago

Oh, actually litellm supports cross-provider token counting, so resolving this may be easier than I anticipated: https://docs.litellm.ai/docs/token_usage
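If we go that route, a rough sketch of what it could look like with litellm's `token_counter` (model names and messages below are only examples):

```python
import litellm

messages = [{"role": "user", "content": "Generate a dataset of question-answer pairs."}]

# Per the linked docs, token_counter picks a tokenizer appropriate to the given
# model name, so the same call works across providers.
for model in ["gpt-3.5-turbo", "claude-2"]:
    print(model, litellm.token_counter(model=model, messages=messages))
```

That would let the token-count logic stay provider-agnostic instead of hard-coding tiktoken.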