Open TongMeng-AI opened 7 months ago
Please remove flag --disable_wasm_exception_catching --disable_rtti
and build again. Should show some error message.
I recompiled as you suggested, ran it again, and got a specific error message: Uncaught (in promise) RuntimeError: null function or function signature mismatch
I didn't see this error before. To further investigate the issue, you may need to add "-s DEMANGLE_SUPPORT=1" in your link flags to check the function names in the stack.
According to your suggestion, I added -s DEMANGLE_SUPPORT=1 during compilation, and a warning appeared: em++: warning: DEMANGLE_SUPPORT is deprecated since mangled names no longer appear in stack traces [-Wdeprecated]
Is there a version correspondence or match between emscripten and onnxruntime?
we are now using emsdk 3.1.51 in onnxruntime main branch.
I use emsdk 3.1.54; will this have any impact?
I am not sure. Maybe you can give 3.1.51 a try.
Can you provide a simple C++ demo based on ONNX Runtime? I want to test if it can run successfully with libonnxruntime_webassembly.a. Is it possible that there's an error in my C++ inference program?
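While waiting for an official sample, a minimal C++ inference program against the ONNX Runtime C++ API looks roughly like the sketch below. The model path "model.onnx", the input/output names "input"/"output", and the {1, 3} float shape are placeholders; they must be replaced with your model's actual names and shapes (mismatched names or shapes are a common cause of Run() failures). It cannot run without linking libonnxruntime_webassembly.a and supplying a real model.

```cpp
#include <onnxruntime_cxx_api.h>
#include <vector>
#include <iostream>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "demo");
    Ort::SessionOptions opts;
    Ort::Session session(env, "model.onnx", opts);  // placeholder path

    // Assumed: one float input of shape {1, 3} named "input" and one
    // output named "output" -- adjust to your model.
    std::vector<float> data{1.0f, 2.0f, 3.0f};
    std::vector<int64_t> shape{1, 3};
    Ort::MemoryInfo mem =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input = Ort::Value::CreateTensor<float>(
        mem, data.data(), data.size(), shape.data(), shape.size());

    const char* in_names[] = {"input"};
    const char* out_names[] = {"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr},
                               in_names, &input, 1, out_names, 1);

    // Print the first output element.
    std::cout << outputs[0].GetTensorMutableData<float>()[0] << std::endl;
    return 0;
}
```

If this sketch also traps at Run(), the problem is likely in the build configuration rather than your calling code.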
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
Describe the issue
I want to perform inference using ONNX Runtime in C++ and call it from an HTML5 page via JavaScript. Loading the model and printing model-related information work fine, but I encounter an error during the session.Run() operation. I'm not sure what specifically is causing the issue. Could you give me some suggestions?
To reproduce
I built libonnxruntime_webassembly.a using the following command: ./build.sh --config Release --build_dir build_wasm --skip_tests --build_wasm_static_lib --enable_wasm_simd --enable_wasm_threads --skip_submodule_sync --allow_running_as_root --disable_wasm_exception_catching --disable_rtti
Urgency
No response
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.17.0
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)