microsoft / onnxruntime-inference-examples

Examples for using ONNX Runtime for machine learning inferencing.
MIT License

The ONNX model can't be loaded on the front end, which uses React #427

Open tanggang1997 opened 4 months ago

tanggang1997 commented 4 months ago

```
failed to inference ONNX model: Error: no available backend found. ERR:
[wasm] RuntimeError: Aborted(CompileError: WebAssembly.instantiate(): expected magic word 00 61 73 6d, found 3c 21 44 4f @+0). Build with -sASSERTIONS for more info.
[cpu] Error: previous call to 'initializeWebAssembly()' failed.
[xnnpack] Error: previous call to 'initializeWebAssembly()' failed.
failed to inference ONNX model: Error: no available backend found. ERR: .
failed to inference ONNX model: Error: no available backend found. ERR: .
```
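For context on what this log means: a valid WebAssembly binary starts with the magic word `00 61 73 6d` (`\0asm`), while the bytes the runtime actually received, `3c 21 44 4f`, are ASCII for `<!DO` -- the start of `<!DOCTYPE html>`. So the request for the `ort-wasm*.wasm` file most likely got an HTML page back (typically the dev server's `index.html` fallback) instead of the binary. A minimal diagnostic sketch (`looksLikeWasm` is a hypothetical helper, not part of onnxruntime-web) that checks a response body for the wasm magic word:

```javascript
// Check whether a byte buffer starts with the WebAssembly magic word
// \0asm (00 61 73 6d). An HTML fallback page starts with "<!DO"
// (3c 21 44 4f) instead, which is exactly what the error above reports.
function looksLikeWasm(bytes) {
  return bytes.length >= 4 &&
    bytes[0] === 0x00 && bytes[1] === 0x61 &&
    bytes[2] === 0x73 && bytes[3] === 0x6d;
}

// An HTML error/fallback page fails the check...
const htmlBytes = new TextEncoder().encode('<!DOCTYPE html>');
console.log(looksLikeWasm(htmlBytes)); // false

// ...while a real wasm module header passes it.
const wasmHeader = new Uint8Array([0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00]);
console.log(looksLikeWasm(wasmHeader)); // true
```

If that is the failure mode here, the usual fix is to make the `.wasm` artifacts actually reachable by the browser, e.g. by copying them from `node_modules/onnxruntime-web/dist/` into the app's static/public directory or by pointing `ort.env.wasm.wasmPaths` at a URL prefix that serves them.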