onnx / onnx-tensorflow

Tensorflow Backend for ONNX

The converted tensorflow model outputs results different from the original onnx model #1029

Open thanhlct opened 2 years ago

thanhlct commented 2 years ago

Describe the bug

The outputs of the model converted with onnx_tf.backend.prepare are different from those of the original onnx model.

To Reproduce

```python
import numpy as np

ort_inputs = {'input1': np.random.rand(1, 3, 48, 160).astype(np.float32),
              'input2': np.random.randint(2, size=(1, 240)) == 1}

# Test onnx
print('test onnx')
import onnxruntime as ort

osession = ort.InferenceSession("nocr.onnx")
ort_outputs = osession.run(None, ort_inputs)
ort_outputs = ort_outputs[0]

# tf run
print('test onnx_tf')
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("./nocr.onnx")
tf_rep = prepare(onnx_model)
toutputs = tf_rep.run(ort_inputs)
toutputs = toutputs.output1

print('===>> results onnx vs onnx_tf backend, max gap:',
      np.max(np.abs(ort_outputs - toutputs)),
      ', max onnx output:', np.max(ort_outputs),
      ', min onnx output:', np.min(ort_outputs))
```
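For what it's worth, a tolerance-based check makes the gap easier to judge than the raw max difference. The thresholds below are illustrative assumptions, not values from the report:

```python
import numpy as np

# Illustrative tolerances (assumed): small float drift between backends is
# normal; a failure here with a large max gap points at a genuine conversion
# problem rather than rounding noise.
np.testing.assert_allclose(ort_outputs, toutputs, rtol=1e-3, atol=1e-5)
```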

ONNX model file

The onnx model is attached.

Python, ONNX, ONNX-TF, Tensorflow version
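A quick way to collect the versions asked for here (assuming the standard PyPI package names):

```python
import sys
from importlib.metadata import version  # Python 3.8+

print('Python:', sys.version.split()[0])
for pkg in ('onnx', 'onnx-tf', 'tensorflow', 'onnxruntime'):
    print(pkg, version(pkg))
```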

Additional context

Tested with some lower versions of tensorflow, but the gap between the model outputs still occurs.

thanhlct commented 2 years ago

Please give us some clues; we can't figure out how to find the root of the problem.
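One way to start localizing this (a sketch, not from the thread): extract a sub-model that ends at an intermediate tensor with onnx.utils.extract_model, run it through both backends, and compare. Bisecting over intermediate tensors narrows the gap down to the first diverging operator. The tensor name below is a placeholder:

```python
import onnx
import onnx.utils

# List candidate intermediate tensor names in the graph.
model = onnx.load("nocr.onnx")
names = [out for node in model.graph.node for out in node.output]

# "intermediate_tensor" is a placeholder; substitute a real name from `names`.
onnx.utils.extract_model(
    "nocr.onnx", "nocr_sub.onnx",
    input_names=["input1", "input2"],
    output_names=["intermediate_tensor"],
)
# Run both onnxruntime and onnx_tf on nocr_sub.onnx and compare as above;
# move the cut point earlier or later until the first diverging node is found.
```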