PINTO0309 / onnx2tf

Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf). I don't need a Star, but give me a pull request.
MIT License

Discrepancies between Conv layer outputs of the ONNX model and converted TensorFlow models #605

Closed SuhwanSong closed 7 months ago

SuhwanSong commented 7 months ago

Issue Type

Feature Request

OS

Linux

onnx2tf version number

1.19.14

onnx version number

1.15.0

onnxruntime version number

1.17.0

onnxsim (onnx_simplifier) version number

0.4.33

tensorflow version number

2.15.0

Download URL for ONNX

https://github.com/PINTO0309/onnx2tf/files/14982911/poc.zip

Parameter Replacement JSON

{}

Description

After converting an ONNX model to TensorFlow/TensorFlow Lite (TFLite) format and comparing the outputs of the ONNX model, the TensorFlow model, and the TFLite model, discrepancies are observed between the ONNX model and both converted models.

Steps to Reproduce

poc.zip

  1. Download and unzip "poc.zip" to get "poc.onnx".
  2. Run the following command on the poc file:
    onnx2tf -i ./poc.onnx -cotof
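
For reference, -cotof is, as far as I can tell from onnx2tf's CLI options, the short form of the full element-wise output check (--check_onnx_tf_outputs_elementwise_close_full), so the command above both converts the model and reports per-element deviations between the ONNX outputs and the converted model's outputs. The equivalent long-form invocation would be roughly:

onnx2tf -i ./poc.onnx --check_onnx_tf_outputs_elementwise_close_full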

Screenshots

(screenshot of the poc.onnx model graph)

Why

Resolving the discrepancies between the outputs of the ONNX model and both TensorFlow models will ensure consistent inference results across different runtime environments. This consistency is crucial for deploying machine learning models in production systems, where uniform behavior is expected regardless of the framework or format used.

PINTO0309 commented 7 months ago

Unfortunately, there is always an error of about 1e-4 due to differences in run-time calculation specifications.

If you can't tolerate that minor error, don't use tflite.
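
As a side note, to check whether an observed discrepancy actually falls within that ~1e-4 range, the two raw outputs can be compared directly. A minimal sketch, where onnx_out and tflite_out are hypothetical placeholders standing in for the ONNX Runtime and TFLite outputs obtained for the same input:

import numpy as np

# Hypothetical placeholders for the two runtimes' outputs on the same input;
# in practice these would come from onnxruntime.InferenceSession.run() and
# tf.lite.Interpreter.get_tensor(), respectively.
onnx_out = np.random.randn(1, 64, 112, 112).astype(np.float32)
tflite_out = onnx_out + np.random.uniform(-1e-4, 1e-4, onnx_out.shape).astype(np.float32)

max_abs_err = np.max(np.abs(onnx_out - tflite_out))
print(f"max abs error: {max_abs_err:.3e}")
# Deviations on this order reflect runtime-specific kernel/accumulation differences.
print("within tolerance:", np.allclose(onnx_out, tflite_out, rtol=1e-3, atol=1e-4))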

SuhwanSong commented 7 months ago

> Unfortunately, there is always an error of about 1e-4 due to differences in run-time calculation specifications.
>
> If you can't tolerate that minor error, don't use tflite.

Thanks :)

Sorry for bothering you, but there is still a discrepancy between the outputs of the TensorFlow model and the ONNX model. Note that the outputs of the TensorFlow Lite and TensorFlow models do match each other.

Here's the new poc file: poc.zip

from os.path import join
import tensorflow
import onnxruntime
import numpy as np

import onnx2tf
from einops import rearrange

if __name__ == "__main__":

    onnx_model_path = 'poc.onnx'
    tf_output_path = './tf_path'

    # Convert ONNX model into TensorFlow
    onnx2tf.convert(
        input_onnx_file_path=onnx_model_path,
        output_folder_path=tf_output_path,
        copy_onnx_input_output_names_to_tflite=True,
        non_verbose=True,
    )

    # input
    input_np = np.random.randn(1, 3, 224, 224).astype('f')

    # load and run onnx model
    ort_session = onnxruntime.InferenceSession(onnx_model_path)
    ort_output  = ort_session.run(None, {'x' : input_np})

    # Prepare input for TensorFlow models
    input_for_tf = rearrange(input_np, 'b c h w -> b h w c')

    # Load and run TFLite model
    interpreter = tensorflow.lite.Interpreter(join(tf_output_path, 'poc_float32.tflite'))
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    interpreter.allocate_tensors()

    interpreter.set_tensor(input_details[0]['index'], input_for_tf)
    interpreter.invoke()
    output_data = interpreter.get_tensor(output_details[0]['index'])
    #print (output_data)

    # Load and run TensorFlow model
    tf_model = tensorflow.saved_model.load(tf_output_path)
    tf_output = tf_model(input_for_tf)

    # Compare ONNX and TensorFlow outputs
    if np.allclose(ort_output, tf_output, rtol=1e-03, atol=1e-04):
        print("Test Passed: ONNX and TensorFlow outputs match\n")
    else:
        print("Test Failed: ONNX and TensorFlow outputs differ\n")

    # Compare TFlite and TensorFlow outputs
    if np.allclose(output_data, tf_output, rtol=1e-03, atol=1e-04):
        print("Test Passed: TFlite and TensorFlow outputs match\n")
    else:
        print("Test Failed: TFlite and TensorFlow outputs differ\n")


PINTO0309 commented 7 months ago

Sorry, but I don't understand the point you are trying to make. The final outputs of ONNX and TFLite match.

pip show onnx2tf

Name: onnx2tf
Version: 1.19.15

onnx2tf -i poc.onnx -cotof

poc_float32.tflite.zip
