microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[Web] #21318

Open guanzongjiang opened 1 month ago

guanzongjiang commented 1 month ago

Describe the issue

```
Error: no available backend found.
ERR: [wasm] RuntimeError: Aborted(LinkError: WebAssembly.instantiate(): Import #46 "a" "V": function import requires a callable). Build with -sASSERTIONS for more info.
[cpu] Error: previous call to 'initWasm()' failed.
    at Dn (onnxruntime-web.js?v=641f318a:80:13)
    at async a.create (onnxruntime-web.js?v=641f318a:600:20)
    at async initModel (AnythingModel.vue:44:17)
    at async initSegment (AnythingModel.vue:26:7)
    at async AnythingModel.vue:137:3
```

To reproduce

Using vue3 + vite:

```js
ort.env.wasm.wasmPaths = {
  'ort-wasm.wasm': 'https://cdnjs.cloudflare.com/ajax/libs/onnxruntime-web/1.14.0/ort-wasm.wasm',
  'ort-wasm-threaded.wasm': 'https://cdnjs.cloudflare.com/ajax/libs/onnxruntime-web/1.14.0/ort-wasm-threaded.wasm',
  'ort-wasm-simd.wasm': 'https://cdnjs.cloudflare.com/ajax/libs/onnxruntime-web/1.14.0/ort-wasm-simd.wasm',
  'ort-wasm-simd-threaded.wasm': 'https://cdnjs.cloudflare.com/ajax/libs/onnxruntime-web/1.14.0/ort-wasm-simd-threaded.wasm'
};
```

```ts
// Note: the original snippet passed an undefined `modeUrl`; the declared
// variable `modelFile` is used here instead.
const modelFile: string = "http://localhost:5173/anythingModel/sam_onnx_quantized_example.onnx";
model = await InferenceSession.create(modelFile);
```

Urgency

No response

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.18.0

Execution Provider

'wasm'/'cpu' (WebAssembly CPU)

fs-eire commented 1 month ago

When specifying a WebAssembly file path override, the version of the .wasm file(s) needs to match the version of the JavaScript file (the onnxruntime-web NPM package, or the version specified in ort.min.js). Using v1.18.0 with a path override pointing to v1.14.0 .wasm files will not work.
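One way to keep the two versions in lockstep is to derive the override from a single version string. This is a minimal sketch, not part of onnxruntime-web's API: the helper name and the jsdelivr base URL are assumptions, and `env.wasm.wasmPaths` accepting a URL-prefix string (in addition to the per-file map) is the documented behavior being relied on.

```javascript
// Sketch: build a wasmPaths prefix whose version matches the installed
// onnxruntime-web JS package (1.18.0 is assumed for this issue).
function makeWasmPathsPrefix(version) {
  // When wasmPaths is a string, the runtime appends the correct .wasm
  // file name for the detected features (SIMD, threads) itself.
  return `https://cdn.jsdelivr.net/npm/onnxruntime-web@${version}/dist/`;
}

// Usage in the app (assumes `import * as ort from 'onnxruntime-web'`):
// ort.env.wasm.wasmPaths = makeWasmPathsPrefix('1.18.0');
```

Pinning the CDN path to the same version you install from NPM avoids the LinkError above, which occurs when a newer JS glue file tries to instantiate an older .wasm binary with a mismatched import table.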