xenova opened this issue 1 year ago
Confirming that I can reproduce this.
Bump. Using Bun (v0.6.9) with Transformers.js works on an M2 Mac, but doesn't work with Bun (v1.0.2) on Ubuntu. Also, is this related to https://github.com/oven-sh/bun/issues/3574?
I believe so. Transformers.js uses onnxruntime-node, so any issues faced there would impact it too.
Anyone know what the cause of this is yet? I see two of the referenced issues are closed as "not planned"
is this an issue with bun, or an issue somewhere else?
> Anyone know what the cause of this is yet? I see two of the referenced issues are closed as "not planned"
Fixing this is 100% planned. They are duplicates of this issue.
> is this an issue with bun

Yes. It is a bug in our NAPI implementation.
Hey @Jarred-Sumner - Any progress on this type of NAPI issue?
If you happen to only need transformers.js, I maintain a fork that runs on bun.
That is exactly my use case
Is your fork here? https://github.com/sroussey/transformers.js
The readme doesn't mention anything about Bun; what exactly changed from the original to make it Bun-compatible?
You can look at the publish branch. @sroussey/transformers on npm
Changes:
And 3: a change to set the logging level, as it complains a lot
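For context, onnxruntime itself exposes a per-session log severity; a minimal sketch of quieting it, assuming onnxruntime-node's standard `SessionOptions` (the fork's actual hook may differ):

```javascript
// Sketch only: assumes onnxruntime-node and a local model.onnx.
// logSeverityLevel: 0 = verbose ... 4 = fatal; 3 keeps errors and fatals.
import * as ort from 'onnxruntime-node';

const session = await ort.InferenceSession.create('model.onnx', {
  logSeverityLevel: 3,
});
```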
You can also check out our v3 development branch (https://github.com/xenova/transformers.js/pull/545), which uses the latest version of onnxruntime-node and should run with bun. Please let me know if you run into any issues!
Thanks both
Back on a computer and not a phone.
Here are the changes: https://github.com/sroussey/transformers.js/compare/main...sroussey:transformers.js:publish?expand=1
They are very minimal!
I do not recommend using my fork for longer than needed. I keep it up to date, but only until the official one works on Bun and deals with the noisy error log.
The change of onnxruntime-node to 1.16.0 fixed the crash. However, later versions have introduced bugs, so I don't plan to update it.
The WASM change is there because the native code sometimes errors and transformers.js retries with WASM, but without setting the thread count to 1 the retry will error again.
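The retry behavior described here can be sketched generically. This is not the actual transformers.js internals, just the try-native-then-fall-back-to-WASM pattern with hypothetical loader names:

```javascript
// Generic backend-fallback sketch (hypothetical names; not the actual
// transformers.js internals): try each backend in order and return the
// first one whose loader doesn't throw.
async function createSession(backends) {
  for (const { name, load } of backends) {
    try {
      return { backend: name, session: await load() };
    } catch (err) {
      console.warn(`Backend "${name}" failed (${err.message}); trying next.`);
    }
  }
  throw new Error('All backends failed');
}

// Demo with stand-in loaders: "native" throws (like the NAPI crash
// under Bun), so the wasm candidate, pinned to one thread, is chosen.
createSession([
  { name: 'native', load: async () => { throw new Error('NAPI crash'); } },
  { name: 'wasm', load: async () => ({ numThreads: 1 }) },
]).then(({ backend }) => console.log(backend)); // prints "wasm"
```

Without the thread-count pin, the WASM retry itself fails under Bun, which is why the fork sets it to 1 up front.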
Once you work with transformers.js on the command line and build a TUI interface, you will hate all the error logs about the models, so I put in a way to change that.
That is it.
I may fix some of the type stuff in the future, if that matters to you.
I made PRs for these changes, if I remember right; it has been a while. They may get back into the main version before v3.
BTW: v3 should have WebGPU enabled, as it is in onnx 1.17, but Bun does not have WebGPU. If you are using CUDA on Windows, it won't matter, as the native onnx runtime should use that. The CoreML support for native Mac is not as well developed. It will be interesting to see how Node with WebGPU (or Deno) compares in WebGPU mode vs. native. Native should win, but with less support for Apple devices in onnx, who knows. Hopefully it does, so Bun won't miss out!
v3 seems to work great for me with no modifications needed. :)
There is not an npm package yet, so I published @sroussey/transformers@3.0.0-alpha.0
Running some tests, all looks good:
I have not tried switching to wasm, but I prefer native anyhow.
Just a quick note: v3 also works for me. `bun add github:xenova/transformers.js#v3` works fine to grab it without having to publish to npm :)
@jkanavin-kdi for me, running `bun add github:xenova/transformers.js#v3` adds `"@huggingface/transformers": "github:xenova/transformers.js#v3"` to package.json dependencies, but TS yields an error (screenshot).

Okay, `"@huggingface/transformers": "^3.0.0-alpha.14"` (installed by running `bun add @huggingface/transformers`) works fine. Here is the repo
In a debug build:
❯ ~/src/bun3/build/debug/bun-debug index.js
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
22 | __classPrivateFieldSet(this, _OnnxruntimeSessionHandler_inferenceSession, new binding_1.binding.InferenceSession(), "f");
23 | if (typeof pathOrBuffer === 'string') {
24 | __classPrivateFieldGet(this, _OnnxruntimeSessionHandler_inferenceSession, "f").loadModel(pathOrBuffer, options);
25 | }
26 | else {
27 | __classPrivateFieldGet(this, _OnnxruntimeSessionHandler_inferenceSession, "f").loadModel(pathOrBuffer.buffer, pathOrBuffer.byteOffset, pathOrBuffer.byteLength, options);
^
error: Error
at new OnnxruntimeSessionHandler (/Users/meghandenny/src/test/node_modules/onnxruntime-node/dist/backend.js:27:92)
at /Users/meghandenny/src/test/node_modules/onnxruntime-node/dist/backend.js:64:29
Something went wrong during model construction (most likely a missing operation). Using `wasm` as a fallback.
[
{
label: "POSITIVE",
score: 0.999788761138916,
}
]
^C
What version of Bun is running?
1.0.0+822a00c4d508b54f650933a73ca5f4a3af9a7983
What platform is your computer?
Linux 5.15.0-1041-azure x86_64 x86_64
What steps can reproduce the bug?
Using Transformers.js causes a segmentation fault. See docs for information about the library.
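The issue's actual snippet isn't included above; a minimal repro consistent with the default model named in the debug log would look something like this (assuming the standard transformers.js `pipeline` API; this is a sketch, not the reporter's original code):

```javascript
// Hypothetical minimal repro; the issue's original code is not shown.
// Uses the default sentiment-analysis model, matching the
// "Xenova/distilbert-base-uncased-finetuned-sst-2-english" log line.
import { pipeline } from '@xenova/transformers';

const classifier = await pipeline('sentiment-analysis');
const result = await classifier('I love Transformers.js!');
console.log(result); // e.g. [{ label: "POSITIVE", score: 0.99... }]
```

Per the thread, a script like this segfaults under Bun while printing the classification under Node.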
What is the expected behavior?
It should output the same as when running with node.js:
What do you see instead?
Additional information
When running for the first time, you might run into an issue with the `sharp` dependency. You can fix it by running their recommended command: `npm install --ignore-scripts=false --foreground-scripts --verbose sharp`. I have tested running transformers.js with and without sharp, so I don't believe this to be the cause of the error.