PINTO0309 / onnx2tf

Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf). I don't need a Star, but give me a pull request.
MIT License

[HybridNets] HybridNets converts successfully but errors on int8 conversion and inference #311

Closed ChRiSsHiH11 closed 1 year ago

ChRiSsHiH11 commented 1 year ago

Issue Type

Others

onnx2tf version number

1.9.1

onnx version number

1.13.1

tensorflow version number

2.12.0

Download URL for ONNX

https://drive.google.com/file/d/1_jVTFW2AalBghRMzQez5u1hpcizGKvZc/view?usp=share_link

Parameter Replacement JSON

No replacement

Description

  1. My homework.
  2. The saved model exports successfully, but when I run inference on the FP32 .tflite and when I convert to int8, this error appears: `RuntimeError: input_channel % filter_input_channel != 0 (1 != 0) Node number 78 (CONV_2D) failed to prepare.`
  3. I have tried different opset_versions (9, 11, 12), different onnx2tf versions, and also converting the .pth file provided in https://github.com/datvuthanh/HybridNets
  4. It needs to run on embedded TF, or I might fail this class.
  5. All training and ONNX export follow https://github.com/datvuthanh/HybridNets
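For context, the RuntimeError in point 2 comes from TFLite's CONV_2D prepare step, which requires the input tensor's channel count to be divisible by the filter's input-channel dimension. A minimal sketch of that consistency check follows; `conv2d_channels_ok` is a hypothetical helper name for illustration, not a TFLite API:

```python
def conv2d_channels_ok(input_channels: int, filter_input_channels: int) -> bool:
    """Mirror of the divisibility check TFLite's CONV_2D prepare step
    performs: the input's channel count must be a positive multiple of
    the filter's input-channel dimension."""
    return (
        filter_input_channels > 0
        and input_channels % filter_input_channels == 0
    )

# The message "(1 != 0)" reports a remainder of 1, i.e. the input's
# channel count is not divisible by the filter's input-channel dimension,
# which points to a mismatched weight layout after conversion.
```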
PINTO0309 commented 1 year ago

I will be out of town for three days starting today, so I will not be able to operate my PC and proceed with various investigations.

https://github.com/PINTO0309/PINTO_model_zoo/tree/main/276_HybridNets

https://user-images.githubusercontent.com/33194443/231661686-85d83892-fe29-45bc-9564-0cb4de7ca042.mp4

ChRiSsHiH11 commented 1 year ago

I have tried converting using the ONNX file from the model zoo; it works and the inference results are correct. I also built a new environment to export the same ONNX format as the model zoo. That seems to have solved the convolution issue, but a reshape issue came up instead.

PINTO0309 commented 1 year ago

Unless there is an error message or a detailed description of the situation, no one will be able to comment or reproduce it.

ChRiSsHiH11 commented 1 year ago

I know. I'm just trying to test more different situations. I'll post results or error messages in a few days.

ChRiSsHiH11 commented 1 year ago

Compared with the model zoo, the ONNX I exported from the same environment has small differences, such as the part boxed in the figure. When converting this ONNX to tflite, there are also differences at the Pad node. Is there anything to pay attention to in the conversion of Pad nodes?

I also noticed that the architecture seems slightly different. Have you adjusted the model?

(screenshots: ONNX vs. TFLite graphs)

PINTO0309 commented 1 year ago

I just disabled MemoryEfficientSwish().

Btw, trying to examine multiple issues at the same time is confusing. It is difficult for me to pinpoint exactly what the problem is because I cannot reproduce the environment you have at your disposal.

The latest version of onnxsim has a bug. https://github.com/PINTO0309/onnx2tf/issues/312#issuecomment-1509871390

ChRiSsHiH11 commented 1 year ago

Inference and quantization both succeed after downgrading onnxsim.

Thank you for all your assistance!!
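For reference, the full-integer quantization step discussed throughout this thread can be sketched roughly as follows. This uses a tiny stand-in Keras model rather than HybridNets, and the shapes and parameters are illustrative assumptions, not values from the thread:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; the thread's actual model is HybridNets exported
# via onnx2tf, which this sketch does not reproduce.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, padding="same", activation="relu"),
])

def representative_dataset():
    # Calibration data determines the int8 scales; real preprocessed
    # frames should be used here, random data is only for illustration.
    for _ in range(10):
        yield [np.random.rand(1, 32, 32, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()

# The "failed to prepare" RuntimeError from the thread surfaces at this
# step, when the interpreter validates and allocates the graph.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
```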