Open clairefro opened 3 months ago
I checked this and found it was because the js-tiktoken package did not support the gpt-4o series models.
I made this series compatible in the code in v1.3.8:
// imports used by this snippet
import { encodingForModel, getEncoding, TiktokenModel } from 'js-tiktoken'
// ...
try {
  // gpt-4o-mini is not yet known to js-tiktoken, so map it to gpt-4o,
  // which uses the same tokenizer
  let jsTikTokenSupportModel: TiktokenModel
  if (model === 'gpt-4o-mini' || model === 'gpt-4o-mini-2024-07-18') {
    jsTikTokenSupportModel = 'gpt-4o'
  } else {
    jsTikTokenSupportModel = model
  }
  modelEncodingCache[model] = encodingForModel(jsTikTokenSupportModel)
} catch (e) {
  // unknown model: fall back to the cl100k_base encoding instead of throwing
  console.error('Model not found. Using cl100k_base encoding.')
  modelEncodingCache[model] = getEncoding('cl100k_base')
}
// ...
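If you want to sanity-check the mapping outside of gpt-tokens, here is a minimal standalone sketch against js-tiktoken (the helper names are made up for illustration, and it assumes a js-tiktoken version that already recognizes gpt-4o but not yet gpt-4o-mini):

```ts
// Sketch only: helper names are illustrative, not part of gpt-tokens.
import { encodingForModel, getEncoding, TiktokenModel } from 'js-tiktoken'

function getEncoder(model: string) {
  try {
    // gpt-4o-mini shares gpt-4o's tokenizer, so map it to a model name
    // js-tiktoken already knows
    const supported: TiktokenModel =
      model.startsWith('gpt-4o-mini') ? 'gpt-4o' : (model as TiktokenModel)
    return encodingForModel(supported)
  } catch {
    // unknown model: fall back to cl100k_base instead of throwing
    return getEncoding('cl100k_base')
  }
}

function countTokens(model: string, text: string): number {
  return getEncoder(model).encode(text).length
}

console.log(countTokens('gpt-4o-mini', 'Hello, world!'))
```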
Thanks for the quick fix - can confirm this resolved the issue!
Didn't realize all models in the series used the same tokenization algo
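For anyone curious, the gpt-4o family maps to the o200k_base vocabulary, which you can check with a quick sketch (assumes a js-tiktoken build recent enough to expose gpt-4o and o200k_base):

```ts
import { encodingForModel, getEncoding } from 'js-tiktoken'

const text = 'The quick brown fox jumps over the lazy dog'
// gpt-4o resolves to the o200k_base encoding, so both paths tokenize identically
const viaModel = encodingForModel('gpt-4o').encode(text)
const viaBase = getEncoding('o200k_base').encode(text)

console.log(viaModel.length, viaBase.length) // same count either way
```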
I was using `gpt-tokens` to calculate `gpt-4o` tokens flawlessly. I recently updated to `1.3.7` to use it with the `gpt-4o-mini` model, but it seems the model is not recognized in the browser:

usage: (screenshot)

config: (screenshot)

I also tried uninstalling/reinstalling all node_modules to ensure the latest `gpt-tokens` (1.3.7) was being used and not some cached older version.

I tried reproducing with just node, but it seems to work fine there with model `gpt-4o-mini`: https://replit.com/@clairefro/gpt-tokens-test-node

Is there a reason it would work in node but not the browser? It worked fine in the browser for `gpt-4o`.