Tencent / TPAT

TensorRT Plugin Autogen Tool
Apache License 2.0

test_tpat.py error #5

Closed: GeneralJing closed this issue 2 years ago

GeneralJing commented 2 years ago

```
Traceback (most recent call last):
  File "test_tpat.py", line 3860, in <module>
    test_abs()
  File "test_tpat.py", line 360, in test_abs
    op_expect(node, inputs=[x], outputs=[y], op_type=op_type, op_name=op_name)
  File "test_tpat.py", line 346, in op_expect
    verify_with_ort_with_trt(model, inputs, op_name, np_result=np_result)
  File "test_tpat.py", line 251, in verify_with_ort_with_trt
    ort_result = get_onnxruntime_output(model, inputs)
  File "test_tpat.py", line 225, in get_onnxruntime_output
    rep = onnxruntime.backend.prepare(model, "CPU")
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/backend/backend.py", line 138, in prepare
    return cls.prepare(bin, device, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/backend/backend.py", line 114, in prepare
    inf = InferenceSession(model, sess_options=options, providers=providers)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 335, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Failed to load model with error: /onnxruntime_src/onnxruntime/core/graph/model_load_utils.h:47 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::basic_string<char>, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only *guarantees* support for models stamped with official released onnx opset versions. Opset 16 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx is till opset 15.
```
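For reference, the error means the generated test model is stamped with ai.onnx opset 16, while the installed ONNX Runtime only supports up to opset 15. A minimal sketch to confirm which opset a model carries (the model path here is a hypothetical example, not a file from the test suite):

```python
import onnx

# Print the opset version stamped on each domain of the model.
# ONNX Runtime 1.9 supports ai.onnx only up to opset 15, so a
# value of 16 here reproduces the INVALID_ARGUMENT error above.
model = onnx.load("abs_test.onnx")  # hypothetical path
for opset in model.opset_import:
    print(opset.domain or "ai.onnx", opset.version)
```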

panlinchao commented 2 years ago

Maybe you can try running `pip install onnx==1.10`; that seems to fix the problem.
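If downgrading the package is not an option, another approach (a sketch, not something the TPAT tests do, and the version converter may not cover every op) is to down-convert the model itself; the file names are placeholders:

```python
import onnx
from onnx import version_converter

# Down-convert a model stamped with opset 16 to opset 15 so that
# ONNX Runtime 1.9 will load it. File names are hypothetical.
model = onnx.load("abs_test.onnx")
converted = version_converter.convert_version(model, 15)
onnx.save(converted, "abs_test_opset15.onnx")
```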

GeneralJing commented 2 years ago

> Maybe you can try running `pip install onnx==1.10`; that seems to fix the problem.

This really helped fix the problem, but when the program runs it hits another error:

```
trt cross_check output False
Traceback (most recent call last):
  File "test_tpat.py", line 3860, in <module>
    test_abs()
  File "test_tpat.py", line 360, in test_abs
    op_expect(node, inputs=[x], outputs=[y], op_type=op_type, op_name=op_name)
  File "test_tpat.py", line 346, in op_expect
    verify_with_ort_with_trt(model, inputs, op_name, np_result=np_result)
  File "test_tpat.py", line 300, in verify_with_ort_with_trt
    assert ret, "result check False"
AssertionError: result check False
```
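For context, this assertion is the numerical cross-check between the ONNX Runtime reference output and the TensorRT plugin output. A minimal sketch of that kind of comparison, with hypothetical result lists and tolerances (the actual values in test_tpat.py may differ):

```python
import numpy as np

# Compare reference outputs (ONNX Runtime) against plugin outputs
# (TensorRT) element-wise; the tolerances here are assumptions,
# not the ones used by test_tpat.py.
def cross_check(ort_results, trt_results, rtol=1e-3, atol=1e-5):
    ret = all(
        np.allclose(o, t, rtol=rtol, atol=atol)
        for o, t in zip(ort_results, trt_results)
    )
    print("trt cross_check output", ret)
    return ret
```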

buptqq commented 2 years ago

> Maybe you can try running `pip install onnx==1.10`; that seems to fix the problem.
>
> This really helped fix the problem, but when the program runs it hits another error:
>
> trt cross_check output False ... AssertionError: result check False

Run `pip install onnxruntime==1.9.0` and `pip install onnx==1.10.0`; we have updated the Dockerfile with these versions. You can also refer to https://github.com/Tencent/TPAT/tree/main/examples if you use TensorFlow.
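A quick way to confirm the pinned environment matches before rerunning the tests (a sanity-check sketch, not part of the repo):

```python
import onnx
import onnxruntime
from onnx import defs

# Verify the versions the thread settles on are actually installed.
print("onnx", onnx.__version__)                # expect 1.10.0
print("onnxruntime", onnxruntime.__version__)  # expect 1.9.0
# onnx 1.10 stamps models with ai.onnx opset 15, which ORT 1.9 accepts.
print("default opset:", defs.onnx_opset_version())
```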

GeneralJing commented 2 years ago

Thank you. I will try that later.