Closed masterkain closed 1 year ago
Facing the same error in an AWS serverless Lambda. I'm using Serverless with TypeScript, so the code gets compiled and bundled for deployment.
Error: Missing tiktoken_bg.wasm
    at /var/task/src/functions/websocket/index.js:51:30647
    at /var/task/src/functions/websocket/index.js:1:222
    at Object.<anonymous> (/var/task/src/functions/websocket/index.js:51:54331)
    at Module._compile (node:internal/modules/cjs/loader:1196:14)
    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1250:10)
    at Module.load (node:internal/modules/cjs/loader:1074:32)
    at Function.Module._load (node:internal/modules/cjs/loader:909:12)
    at Module.require (node:internal/modules/cjs/loader:1098:19)
    at require (node:internal/modules/cjs/helpers:108:18)
    at _tryRequireFile (file:///var/runtime/index.mjs:912:37)
Same error on my side
"chatgpt": "^5.2.4",
"next": "13.4.4",
I get the issue with a netlify function.
I'm using Next.js 13 + Yarn Berry, and I ran into this issue and solved it by using copy-webpack-plugin.
Unplug tiktoken (I'm using yarn berry)
$ yarn unplug tiktoken -R
Add the following code to next.config.js:
const CopyWebpackPlugin = require("copy-webpack-plugin");
...
webpack(config) {
  config.plugins = [
    ...config.plugins,
    new CopyWebpackPlugin({
      patterns: [
        {
          from: ".yarn/unplugged/tiktoken-*/node_modules/tiktoken/lite/tiktoken_bg.wasm",
          to: "tiktoken_bg.wasm",
          toType: "file",
        },
      ],
    }),
  ];
  return config;
}
It copies the file to .next/server/chunks at build time. That's all.
I found that in Next.js 13, tiktoken searches for the tiktoken_bg.wasm file in the following candidate paths (from the project root):

.next/server/chunks/tiktoken_bg.wasm
.next/server/chunks/node_modules/tiktoken/lite/tiktoken_bg.wasm
.next/server/node_modules/tiktoken/lite/tiktoken_bg.wasm
.next/node_modules/tiktoken/lite/tiktoken_bg.wasm
node_modules/tiktoken/lite/tiktoken_bg.wasm
../node_modules/tiktoken/lite/tiktoken_bg.wasm
/Users/tatekim/node_modules/tiktoken/lite/tiktoken_bg.wasm
...
/node_modules/tiktoken/lite/tiktoken_bg.wasm

You may copy the file to one of the other locations instead.
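Given those candidate paths, a one-line postbuild step can stand in for the webpack plugin. This is a sketch assuming a standard node_modules install; adjust the source path for Yarn PnP/unplugged layouts:

```shell
# Copy the wasm into one of the paths tiktoken probes at runtime.
cp node_modules/tiktoken/lite/tiktoken_bg.wasm .next/server/chunks/tiktoken_bg.wasm
```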
@TateKim can you share how you managed to get the yarn unplug approach working? As per this issue: https://github.com/vercel/vercel/discussions/4223 there are workarounds, but I have been unable to make any of them work on my end.
See edge-safe examples here:
OpenAI: https://www.npmjs.com/package/js-tiktoken
Anthropic: https://github.com/anthropics/anthropic-tokenizer-typescript/issues/6#issuecomment-1747995284
Thanks a lot man. We're working on the next 1b dollar company and you saved our ass.
Thanks a lot. Finally managed to find a way to count tokens on streaming. Hope this works.
If you use js-tiktoken with the serverless framework, make sure to exclude it from the bundle.
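For example, with the serverless-esbuild plugin you can mark the package as external in serverless.yml. This is a hedged sketch: the option name and placement depend on which bundler plugin and version you use, so check its documentation:

```yaml
# serverless.yml (assumes the serverless-esbuild plugin;
# other bundler plugins use different option names)
custom:
  esbuild:
    external:
      - js-tiktoken
```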
Verify latest chatgpt release
Verify webapp is working
Environment details
Describe the Bug
I'm using chatgpt-api in a Next.js 13 route handler (an HTTP endpoint running in the Node.js runtime), and I have two problems:
1) a message during development about keyv, which is not imported in my project
2) it doesn't work once the app is transpiled and running in production: calling the endpoint results in the error below, although it works in development:
I already tried https://socket.dev/npm/package/@dqbd/tiktoken#nextjs, with the same result.