Error:
```
Load model from onnx_static_quantizer3.onnx failed:/proj/xcdhdstaff1/huizhang/workspace/xcdl190091/workspace/sft_ai_sdk_pack_2022.2/tmp/edge/src/onnxruntime/onnxruntime/core/graph/model_load_utils.h:47 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::__cxx11::basic_string, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only guarantees support for models stamped with official released onnx opset versions. Opset 19 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain com.ms.internal.nhwc is till opset 17.
Stacktrace:
/usr/lib/libonnxruntime.so.1.14.0(+0x75cd38) [0xffffb1602d38]
/usr/lib/libonnxruntime.so.1.14.0(+0x75e83c) [0xffffb160483c]
/usr/lib/libonnxruntime.so.1.14.0(+0x75e9c4) [0xffffb16049c4]
/usr/lib/libonnxruntime.so.1.14.0(+0x75ed00) [0xffffb1604d00]
/usr/lib/libonnxruntime.so.1.14.0(+0xe5370) [0xffffb0f8b370]
/usr/lib/libonnxruntime.so.1.14.0(+0xf6adc) [0xffffb0f9cadc]
/usr/lib/libonnxruntime.so.1.14.0(+0xf70d4) [0xffffb0f9d0d4]
/usr/lib/libonnxruntime.so.1.14.0(+0xffc50) [0xffffb0fa5c50]
/usr/lib/libonnxruntime.so.1.14.0(+0xa4f04) [0xffffb0f4af04]
/usr/lib/libonnxruntime.so.1.14.0(+0xa54d8) [0xffffb0f4b4d8]
./test_yolov8_cpp_onnx(+0x58e4) [0xaaaae26b58e4]
./test_yolov8_cpp_onnx(+0x33a4) [0xaaaae26b33a4]
/lib/libc.so.6(+0x2b030) [0xffffb0a56030]
/lib/libc.so.6(__libc_start_main+0x98) [0xffffb0a56108]
./test_yolov8_cpp_onnx(+0x3cb0) [0xaaaae26b3cb0]
Aborted
```
To reproduce:
1- Export YOLOv8-seg to ONNX:
`!yolo export model='model.pt' opset=12 imgsz=480,640 format=onnx`
2- Quantize the exported model with the vai_q_onnx quantizer.
3- Run inference (using the resnet_pt example as a reference); the error above is raised while loading the quantized ONNX model. A Python sketch of steps 1-2 is given below.
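For reference, a minimal Python sketch of steps 1-2 under stated assumptions: the file names and the calibration reader are placeholders, the ONNX input name `images` is the usual YOLOv8 export default, and vai_q_onnx is assumed to expose a `quantize_static` entry point mirroring `onnxruntime.quantization.quantize_static`.

```python
# Hypothetical end-to-end sketch of the export + quantization steps.
# Assumptions: 'model.pt' is a placeholder, the input tensor is named "images"
# (the usual YOLOv8 export default), and vai_q_onnx.quantize_static mirrors
# onnxruntime.quantization.quantize_static.
import numpy as np
from ultralytics import YOLO
from onnxruntime.quantization import CalibrationDataReader
import vai_q_onnx

# Step 1: export YOLOv8-seg to ONNX at opset 12 (same as the yolo CLI call above).
YOLO("model.pt").export(format="onnx", opset=12, imgsz=(480, 640))

# Minimal calibration reader feeding random data; replace with real preprocessed images.
class RandomReader(CalibrationDataReader):
    def __init__(self, n=8):
        self.batches = iter(
            [{"images": np.random.rand(1, 3, 480, 640).astype(np.float32)} for _ in range(n)]
        )

    def get_next(self):
        return next(self.batches, None)

# Step 2: static quantization with vai_q_onnx (assumed signature).
vai_q_onnx.quantize_static(
    "model.onnx",                    # float model produced by the export above
    "onnx_static_quantizer3.onnx",   # quantized model that later fails to load
    RandomReader(),
)
```

Step 3 (loading `onnx_static_quantizer3.onnx` in the C++ test `test_yolov8_cpp_onnx`) is where the error above is thrown.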
From the error message:
Opset 19 is under development and support for this is limited.
Current official support for domain com.ms.internal.nhwc is till opset 17.
it seems the quantized model is stamped with opset 19 for the com.ms.internal.nhwc domain, while this ONNX Runtime build (1.14.0, per the stack trace) only supports that domain up to opset 17, so such a model cannot be loaded at the moment.
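To confirm which opset each domain in the quantized model actually declares, a quick diagnostic with the onnx Python package (the file name is taken from the error above; purely an illustrative sketch):

```python
# Print the opset version declared for each domain in the quantized model.
import onnx

model = onnx.load("onnx_static_quantizer3.onnx")
for opset in model.opset_import:
    # An empty domain string means the default ai.onnx domain.
    print(opset.domain or "ai.onnx", "->", opset.version)
```

If com.ms.internal.nhwc is listed there with version 19, that is exactly the combination rejected by the check in model_load_utils.h quoted above.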