transitive-bullshit / agentic

AI agent stdlib that works with any LLM and TypeScript AI SDK.
https://agentic.so
MIT License
16.38k stars 2.14k forks

Error: Missing tiktoken_bg.wasm issue with next.js 13 #570

Closed masterkain closed 1 year ago

masterkain commented 1 year ago

Verify latest release

Verify webapp is working

Environment details

"next": "13.4.2"
"chatgpt": "^5.2.4"

Describe the Bug

I'm using chatgpt-api in a Next.js 13 route handler (an HTTP endpoint running in the Node.js runtime). I have two problems:

1) A warning during development about keyv, which I don't import in my project:

- wait compiling /api/ai/route (client and server)...
- warn ./node_modules/keyv/src/index.js
Critical dependency: the request of a dependency is an expression
Import trace for requested module:
./node_modules/keyv/src/index.js
./node_modules/chatgpt/build/index.js
./app/api/ai/route.ts

2) It doesn't work once the app is built and running in production; calling the endpoint results in the error below. It works fine in development:

Error: Missing tiktoken_bg.wasm
    at 3590 (/app/.next/server/chunks/75.js:314:26)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:25:43)
    at 6136 (/app/.next/server/chunks/75.js:1277:16)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:25:43)
    at 4242 (/app/.next/server/app/api/ai/route.js:129:13)
    at __webpack_require__ (/app/.next/server/webpack-runtime.js:25:43)
    at __webpack_exec__ (/app/.next/server/app/api/ai/route.js:267:39)
    at /app/.next/server/app/api/ai/route.js:268:81
    at __webpack_require__.X (/app/.next/server/webpack-runtime.js:150:21)
    at /app/.next/server/app/api/ai/route.js:268:47

I already tried https://socket.dev/npm/package/@dqbd/tiktoken#nextjs with the same result.
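For reference, the Next.js workaround linked above essentially enables WebAssembly in the webpack build so the .wasm file can be bundled. A minimal sketch of that config (whether it is sufficient depends on the Next.js version and runtime):

```javascript
// next.config.js (sketch): enable async WebAssembly so webpack can emit and
// load tiktoken_bg.wasm in the server build.
/** @type {import('next').NextConfig} */
const nextConfig = {
  webpack(config) {
    config.experiments = { ...config.experiments, asyncWebAssembly: true };
    return config;
  },
};

module.exports = nextConfig;
```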

iam-dev0 commented 1 year ago

Facing the same error in an AWS serverless Lambda. I'm using Serverless with TypeScript, which compiles the code for deployment.

Error: Missing tiktoken_bg.wasm
    at /var/task/src/functions/websocket/index.js:51:30647
    at /var/task/src/functions/websocket/index.js:1:222
    at Object.<anonymous> (/var/task/src/functions/websocket/index.js:51:54331)
    at Module._compile (node:internal/modules/cjs/loader:1196:14)
    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1250:10)
    at Module.load (node:internal/modules/cjs/loader:1074:32)
    at Function.Module._load (node:internal/modules/cjs/loader:909:12)
    at Module.require (node:internal/modules/cjs/loader:1098:19)
    at require (node:internal/modules/cjs/helpers:108:18)
    at _tryRequireFile (file:///var/runtime/index.mjs:912:37)

lionel95200x commented 1 year ago

Same error on my side:

    "chatgpt": "^5.2.4",
    "next": "13.4.4",

BLipscomb commented 1 year ago

I get the issue with a Netlify function.

TateKim commented 1 year ago

I'm using Next.js 13 + Yarn Berry. I ran into this issue and solved it by using copy-webpack-plugin.

  1. Unplug tiktoken (I'm using Yarn Berry):

    $ yarn unplug tiktoken -R

  2. Add the following code to next.config.js:

    const CopyWebpackPlugin = require("copy-webpack-plugin");
    // ...
    webpack(config) {
      config.plugins = [
        ...config.plugins,
        new CopyWebpackPlugin({
          patterns: [
            {
              from: ".yarn/unplugged/tiktoken-*/node_modules/tiktoken/lite/tiktoken_bg.wasm",
              to: "tiktoken_bg.wasm",
              toType: "file",
            },
          ],
        }),
      ];
      return config;
    },

    It copies the file to .next/server/chunks at build time. That's all.

I found that tiktoken searches for the tiktoken_bg.wasm file in the following candidate paths in Next.js 13 (from the project root):

    .next/server/chunks/tiktoken_bg.wasm
    .next/server/chunks/node_modules/tiktoken/lite/tiktoken_bg.wasm
    .next/server/node_modules/tiktoken/lite/tiktoken_bg.wasm
    .next/node_modules/tiktoken/lite/tiktoken_bg.wasm
    node_modules/tiktoken/lite/tiktoken_bg.wasm
    ../node_modules/tiktoken/lite/tiktoken_bg.wasm
    /Users/tatekim/node_modules/tiktoken/lite/tiktoken_bg.wasm
    ...
    /node_modules/tiktoken/lite/tiktoken_bg.wasm

You may copy the file to other places as well.

jsandlerus commented 1 year ago

@TateKim can you provide how you managed to get the yarn unplug working? As per this issue: https://github.com/vercel/vercel/discussions/4223 there are work arounds but I have been unable to make it work on my end.

iwasrobbed commented 1 year ago

See edge-safe examples here:

OpenAI: https://www.npmjs.com/package/js-tiktoken
Anthropic: https://github.com/anthropics/anthropic-tokenizer-typescript/issues/6#issuecomment-1747995284

eljommys commented 1 year ago

> See edge-safe examples here:
>
> OpenAI: https://www.npmjs.com/package/js-tiktoken
> Anthropic: anthropics/anthropic-tokenizer-typescript#6 (comment)

Thanks a lot man. We're working on the next 1b dollar company and you saved our ass.

pushkarsingh32 commented 7 months ago

> See edge-safe examples here:
>
> OpenAI: https://www.npmjs.com/package/js-tiktoken
> Anthropic: anthropics/anthropic-tokenizer-typescript#6 (comment)
>
> Thanks a lot man. We're working on the next 1b dollar company and you saved our ass.

Thanks a lot. I finally managed to find a way to count tokens on streaming. Hope this works.
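A minimal sketch of counting tokens over a streamed response. The `encode` below is a runnable whitespace-split stand-in for a real tokenizer such as js-tiktoken's (e.g. `getEncoding("cl100k_base").encode`); everything else is an assumption about how you receive chunks.

```javascript
// Stand-in tokenizer: swap in js-tiktoken's encoder for real token counts.
const encode = (text) => text.split(/\s+/).filter(Boolean);

// Accumulate streamed chunks, then encode the full text once at the end;
// encoding chunk-by-chunk can miscount tokens that span chunk boundaries.
async function countStreamedTokens(chunks) {
  let buffer = "";
  for await (const chunk of chunks) buffer += chunk;
  return encode(buffer).length;
}
```

Re-encoding the whole buffer at the end trades a little extra work for correctness at chunk boundaries.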

pipeu commented 1 month ago

If you use js-tiktoken with the Serverless framework, make sure to exclude it from the bundle.
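With an esbuild-based bundling step (e.g. serverless-esbuild), that exclusion boils down to an `external` entry; the options object below is a sketch with a hypothetical entry point, not a complete deployment config.

```javascript
// esbuild options sketch: mark js-tiktoken as external so it is resolved
// from node_modules at runtime instead of being inlined into the bundle.
const buildOptions = {
  entryPoints: ["src/handler.ts"], // hypothetical entry point
  bundle: true,
  platform: "node",
  external: ["js-tiktoken"],
};

module.exports = buildOptions;
```

The excluded package (or its install in node_modules) then has to be shipped alongside the function.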