Hey, I happened to notice you started using my tokenizer: https://github.com/belladoreai/llama-tokenizer-js
I was happy to see you use it, but then I realized you're building this on top of OpenAI's API. OpenAI's models use a different tokenizer than LLaMA, so the token counts won't match. I think you need to use a different library, such as https://github.com/niieani/gpt-tokenizer
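
If it helps, here's a minimal sketch of the mismatch, assuming the documented default exports of both packages (adjust the imports to your setup):

```ts
// Compare token counts for the same string under the two tokenizers.
import llamaTokenizer from 'llama-tokenizer-js';      // LLaMA SentencePiece BPE
import { encode as gptEncode } from 'gpt-tokenizer';  // OpenAI BPE (cl100k_base in recent versions)

const text = 'Hello world!';

// Different vocabularies and merge rules mean the same string
// generally maps to different token ids and different counts.
console.log('LLaMA token count: ', llamaTokenizer.encode(text).length);
console.log('OpenAI token count:', gptEncode(text).length);
```

Counts from the wrong tokenizer can be off enough to matter when you're trimming prompts to fit a context window.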