The DeepLabV3_resnet50_vaiq_int8 model fails to execute on nod-ai/onnxruntime's IREE EP, running on llvm-cpu backend. The error logs have been attached[1]. The model.onnx file was generated using the following command:
Test Details

As test input and output for the EP, the files test-run/onnx/models/DeepLabV3_resnet50_vaiq_int8/DeepLabV3_resnet50_vaiq_int8.default.input.pt and test-run/onnx/models/DeepLabV3_resnet50_vaiq_int8/DeepLabV3_resnet50_vaiq_int8.default.goldoutput.pt were used. The tests were placed as input_0.pt and output_0.pt under the test_data_set_0/ directory in the directory containing model.onnx.
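A minimal sketch of how that layout can be reproduced (the source paths are the ones listed above; the name of the directory holding model.onnx, e.g. the DeepLabModel directory referenced further down, is local to this setup):

```python
# Sketch only: copy the golden input/output pair into the test_data_set_0/
# layout next to model.onnx. Source paths are the ones from this report;
# the model directory name is whatever is used locally.
import shutil
from pathlib import Path

src = Path("test-run/onnx/models/DeepLabV3_resnet50_vaiq_int8")
model_dir = Path("DeepLabModel")            # directory containing model.onnx
test_dir = model_dir / "test_data_set_0"
test_dir.mkdir(parents=True, exist_ok=True)

shutil.copy(src / "DeepLabV3_resnet50_vaiq_int8.default.input.pt", test_dir / "input_0.pt")
shutil.copy(src / "DeepLabV3_resnet50_vaiq_int8.default.goldoutput.pt", test_dir / "output_0.pt")
```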
The EP was run with the command:
followed by
where DeepLabModel is the dir that contains model.onnx.
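Independently of the EP command, the golden pair can be sanity-checked against onnxruntime's default CPU provider. The sketch below is illustrative only and assumes each .pt file stores a single tensor, which has not been verified against the test suite's actual serialization:

```python
# Illustrative sanity check, not the EP invocation used above: replay the
# golden input through the default CPU provider and diff against the golden
# output. Assumes each .pt file holds a single torch tensor.
import numpy as np
import onnxruntime as ort
import torch

model_dir = "DeepLabModel"  # directory containing model.onnx
inp = torch.load(f"{model_dir}/test_data_set_0/input_0.pt").numpy()
gold = torch.load(f"{model_dir}/test_data_set_0/output_0.pt").numpy()

sess = ort.InferenceSession(f"{model_dir}/model.onnx",
                            providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
out = sess.run(None, {input_name: inp})[0]
print("max abs diff vs. golden output:", float(np.abs(out - gold).max()))
```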
Build command for nod-ai/onnxruntime
To build nod-ai/onnxruntime, the following build command was used:

Build command for IREE
To build IREE, the following build command was used:
This is a cpu-backend-only build, with cpuinfo disabled to make it compatible with onnxruntime.
Other failures

The first command stated above:
fails at iree-compile, with iree-compile.log also being attached[2].

[1] DeepLabV3Resnet50EPFailure.txt
[2] iree-compile.log