microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[Web] LinkError when using custom built WASM artifacts #20970

miguel-lorenzo opened this issue 1 month ago (status: Open)

miguel-lorenzo commented 1 month ago

Describe the issue

When trying to use a custom-built WASM artifact, the following error is thrown after the file is downloaded:

wasm streaming compile failed: LinkError: WebAssembly.instantiate(): Import #36 module="a" function="K" error: function import requires a callable

WASM artifacts were built following https://onnxruntime.ai/docs/build/web.html#build-onnx-runtime-webassembly-artifacts with the following command:

./build.sh --config Release --build_wasm --skip_tests --parallel --enable_wasm_simd

At first I tried --minimal_build because I wanted a lighter .wasm, but not even the "full" artifact works for me. The WASM files from the CDN run my model without problems. I am using the same version (v1.18.0) both when building the artifacts and when running the model, and the files are served correctly (200 OK responses are returned).

Btw, having minimal builds of the WASM files on CDN would be great!

To reproduce

  1. Build a custom WASM artifact following the instructions above
  2. Use the resulting artifacts with onnxruntime-web to run any model

Urgency

Not urgent

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.18.0

Execution Provider

'wasm'/'cpu' (WebAssembly CPU)

fs-eire commented 1 month ago

When you build from source, you also need to build your own ort.min.js file from the same WASM artifacts. Otherwise the function import table expected by the JS glue code will not match the custom .wasm, producing the LinkError shown in the issue description.

github-actions[bot] commented 12 hours ago

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.