zhaochenyang20 opened 10 months ago
I'm not 100% sure we actually need to fix this. We could just use tiktoken unless we run into problems.
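For the tiktoken route, a minimal counting helper might look like this (just a sketch; the `count_tokens` name and the fallback to `cl100k_base` are my assumptions, not existing code in the repo):

```python
import tiktoken


def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    """Count tokens with tiktoken, falling back for unknown models."""
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # tiktoken raises KeyError for models it doesn't recognize;
        # fall back to a common encoding as an approximation.
        encoding = tiktoken.get_encoding("cl100k_base")
    return len(encoding.encode(text))


print(count_tokens("Hello, world!"))
```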
Oh, actually litellm supports cross-provider token counting, so resolving this may be easier than I anticipated: https://docs.litellm.ai/docs/token_usage
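For concreteness, here's roughly what that looks like per the linked docs (`token_counter` is litellm's documented helper; the model name is just an example):

```python
from litellm import token_counter

messages = [{"role": "user", "content": "Hey, how's it going?"}]

# token_counter uses the model-specific tokenizer where litellm knows it,
# and falls back to a tiktoken-based default otherwise.
n_tokens = token_counter(model="gpt-3.5-turbo", messages=messages)
print(n_tokens)
```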
https://github.com/neulab/prompt2model/pull/335#issuecomment-1711578817
Another thing: I'm not sure that all the API-based models use tiktoken to compute input tokens, so the number of input tokens may differ slightly between models. But that difference should be minimal.
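As a quick illustration of how small the gap tends to be, here are counts for the same text under two of tiktoken's encodings (the encoding names are just stand-ins for different providers' tokenizers):

```python
import tiktoken

text = "Token counts can differ slightly across tokenizers."
for name in ("cl100k_base", "p50k_base"):
    enc = tiktoken.get_encoding(name)
    print(name, len(enc.encode(text)))
```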