PINTO0309 / onnx2tf

Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf). I don't need a Star, but give me a pull request.
MIT License

how to fuse activation into conv: fused_activation_function=NONE #613

Closed helloworld77 closed 7 months ago

helloworld77 commented 7 months ago

Issue Type

Feature Request

OS

Linux

onnx2tf version number

1.10.0

onnx version number

1.10.2

onnxruntime version number

none

onnxsim (onnx_simplifier) version number

none

tensorflow version number

2.13.0

Download URL for ONNX

https://github.com/onnx/models/blob/main/Computer_Vision/resnet18_Opset18_torch_hub/resnet18_Opset18.onnx

Parameter Replacement JSON

none

Description

  1. Run fast on NPUs via TFLite NNAPI support.
  2. TensorFlow official quant model: http://download.tensorflow.org/models/mobilenet_v1_2018_08_02/mobilenet_v1_1.0_224_quant.tgz. In this model, the ReLU6 activation is fused into the conv, and fused_activation_function=NONE.
  3. I tried to analyze this quantized model, and I found that it was compatible with tensorflow <= 1.12.x.
  4. Fusing the activation into conv ops could make models run fast on UniSoc chips.
  5. I tried to convert resnet18_Opset18.onnx with onnx-tf, but I could not get fused_activation_function=NONE.
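As a side note on point 2, here is a minimal sketch (my own illustration with assumed numbers, not values taken from the MobileNet model above) of why a uint8 quantized conv can carry an implicit ReLU6 even with fused_activation_function=NONE: if the output quantization range is chosen as [0, 6], saturating to the uint8 range performs the clamp by itself.

```python
import numpy as np

def quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Affine-quantize a float tensor to uint8, saturating at the type bounds."""
    q = np.round(x / scale + zero_point)
    return np.clip(q, qmin, qmax).astype(np.uint8)

def dequantize(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

# Hypothetical output quantization params representing the float range [0, 6].
scale = 6.0 / 255.0
zero_point = 0

# Raw conv outputs before any activation (one negative, one above 6).
conv_out = np.array([-1.0, 0.5, 3.0, 7.5], dtype=np.float32)

# Quantizing with these params saturates to [0, 6]: an implicit ReLU6,
# so no explicit fused_activation_function is needed in the op.
restored = dequantize(quantize(conv_out, scale, zero_point), scale, zero_point)
print(restored)  # negatives clamp to 0, values above 6 clamp to 6
```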
PINTO0309 commented 7 months ago

> relu6 activation is fuse into conv

It is completely wrong.

(image)

No wonder.

(image)

helloworld77 commented 7 months ago

> It is completely wrong.

Thanks for your response! I don't think it is wrong. Take a look at the scale and zero-point format:

(image)

The old format folds the ReLU-like activation into the scale computation.
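To illustrate that claim (my own arithmetic, assuming the classic asymmetric uint8 scheme): if the calibrated output range after ReLU6 is [0, 6], the derived scale and zero-point already encode the clamp, so the stored activation field can remain NONE.

```python
import numpy as np

def quant_params(rmin, rmax, qmin=0, qmax=255):
    """Derive asymmetric uint8 quantization params from a float range
    (classic TFLite-style scheme; the range is nudged to include 0)."""
    rmin, rmax = min(rmin, 0.0), max(rmax, 0.0)
    scale = (rmax - rmin) / (qmax - qmin)
    zero_point = int(round(qmin - rmin / scale))
    return scale, zero_point

# ReLU6 limits the calibrated output range to [0, 6], so:
scale, zero_point = quant_params(0.0, 6.0)
print(scale, zero_point)  # scale = 6/255 ≈ 0.0235, zero_point = 0
```

With these parameters, every representable value already lies in [0, 6], which is exactly the ReLU6 output range.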

PINTO0309 commented 7 months ago

I am posting only the reference URL for the research process as I have to be out of town for an extended period of time. I will check it when I get home.

https://stackoverflow.com/questions/62000090/how-does-tflite-fuses-relu-into-conv-layers

helloworld77 commented 7 months ago

> I am posting only the reference URL for the research process as I have to be out of town for an extended period of time. I will check it when I get home.
>
> https://stackoverflow.com/questions/62000090/how-does-tflite-fuses-relu-into-conv-layers

Thanks! I found that this kind of fusion (fused_activation_function=NONE) was adopted by tensorflow <= 1.12.x. After that version, activations like ReLU became explicit (fused_activation_function=RELU). There should be some switch in the latest tensorflow, because the type FusedActivationFunctionType::kNone is still in the main branch of tensorflow. Thank you again for your great work.

helloworld77 commented 7 months ago

> I am posting only the reference URL for the research process as I have to be out of town for an extended period of time. I will check it when I get home.
>
> https://stackoverflow.com/questions/62000090/how-does-tflite-fuses-relu-into-conv-layers

I will take a look at this solution. Thanks again.

PINTO0309 commented 7 months ago

I have looked at all of the APIs in v2.16.1, and there do not appear to be any flags or options that do what you expect. Once I remove the TODO flag, the bot will automatically close this issue after 5 days if there is no new feedback from you.

helloworld77 commented 7 months ago

> I have looked at all of the API in v2.16.1 and there does not appear to be any flags or options that you would expect.

Thanks, it really seems there is no such flag. I will use tf-v1.x as a workaround.