Closed — EiffelFly closed this issue 11 months ago
same here
back to this repo: https://github.com/niieani/gpt-tokenizer
Hi! This does seem to be a regression caused by 1.0.7, will address this issue in later versions.
In the meantime, js-tiktoken is a pure JS version of the package, which should be more suitable for edge runtimes such as Cloudflare Workers.
@dqbd I can confirm js-tiktoken is working correctly in the edge runtime, since langchainjs uses this package under the hood!! Thanks for the heads up.
Thanks @dqbd! I was a bit confused as well, since the readme still points towards using the WASM version.
Is the plan to maintain both wasm and JS going forward?
Yep! Both of the libraries have their advantages and disadvantages and the plan is to support both for the foreseeable future.
@dqbd I'm having the same issue on Vercel (edge function). Since you've closed the issue, what was the resolution of the problem? Migrate to js-tiktoken I guess? ;-)
Still running into this error using Vercel edge functions. Any updates here? Using js-tiktoken seems to make the edge function size >1MB
Hi, thanks for building this tool! I encountered a problem when I tried to run this repo in a Cloudflare Worker.
You can go to this repo to reproduce it: reproduce repo

1. `pnpm i` at the root folder
2. `pnpm dev`
3. turn on local mode in the terminal

Then you will see the issue happening.
It is quite hard to find a solution on the web, so I am curious how you test this package on Cloudflare Workers and make it work.