[Open] AlexeyAB opened this issue 4 years ago
@AlexeyAB I have the same problem: the Transpose ops increase inference time. Do you have any solutions?
Same here. Is there any way to deal with these Transpose ops? They cost too much time.
Any update on this issue?
I have the same issue
I have the same issue: pytorch (NCHW) --> onnx (NCHW) --> tf-saved-model (NHWC) [Transpose layers introduced here].
@AlexeyAB Please share if you found a solution.
I found a solution; perform the following steps to convert from pytorch ---> tflite (NCHW ---> NHWC):

1. [N,C,H,W] ---> [N,C,H,W]
2. [N,C,H,W] ---> [N,C,H,W]
3. [N,C,H,W] ---> [N,H,W,C] (using the openvino docker image)
4. [N,H,W,C] ---> [N,H,W,C] (using the openvino2tensorflow :zap: :bulb: docker image; see https://github.com/PINTO0309/openvino2tensorflow.git)
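For clarity on the layout labels in the steps above: NCHW ---> NHWC is just an axis permutation, and each Transpose layer the converter inserts performs exactly this permutation at runtime. A minimal numpy sketch (the shape is illustrative, not from this thread):

```python
import numpy as np

# NCHW tensor: batch, channels, height, width (illustrative shape)
x_nchw = np.random.rand(1, 3, 4, 5)

# NCHW -> NHWC: move the channel axis to the end
x_nhwc = x_nchw.transpose(0, 2, 3, 1)
assert x_nhwc.shape == (1, 4, 5, 3)

# NHWC -> NCHW: the inverse permutation restores the original layout
x_back = x_nhwc.transpose(0, 3, 1, 2)
assert np.array_equal(x_back, x_nchw)
```

The permutation changes no values, but in a deployed graph each Transpose materializes a reordered copy of the tensor, which is where the reported overhead comes from.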
:smile: Nowadays this tool, a greatly improved successor to openvino2tensorflow, makes the conversion much easier and more accurate. Incidentally, GroupedConvolution is also supported.
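For context on GroupedConvolution: a grouped conv splits the channels into G groups and convolves each group independently, which a converter without native support has to emulate as G separate Conv2D ops plus a concat. A numpy sketch with 1x1 kernels for brevity (shapes are hypothetical):

```python
import numpy as np

def grouped_conv1x1(x, w, groups):
    """Grouped 1x1 convolution on an NCHW tensor.

    x: (N, C_in, H, W); w: (C_out, C_in // groups, 1, 1).
    Each group of input channels is convolved with its own slice
    of the kernel, then the group outputs are concatenated.
    """
    n, c_in, h, wd = x.shape
    c_out = w.shape[0]
    cg_in, cg_out = c_in // groups, c_out // groups
    outs = []
    for g in range(groups):
        xg = x[:, g * cg_in:(g + 1) * cg_in]          # (N, cg_in, H, W)
        wg = w[g * cg_out:(g + 1) * cg_out, :, 0, 0]  # (cg_out, cg_in)
        # A 1x1 conv is a channel-mixing matmul at every spatial position
        outs.append(np.einsum('nchw,oc->nohw', xg, wg))
    return np.concatenate(outs, axis=1)

x = np.random.rand(1, 4, 2, 2)
w = np.random.rand(6, 2, 1, 1)  # groups=2: each group maps 2 -> 3 channels
y = grouped_conv1x1(x, w, groups=2)
assert y.shape == (1, 6, 2, 2)
```

The per-group loop and concat mirror the multiple-Conv2D pattern the issue below complains about; a single grouped kernel does the same math in one fused op.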
"Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf)." https://github.com/PINTO0309/onnx2tf
**Describe the bug**

Why does `onnx-tensorflow` add Transpose layers for each Conv2D layer?

Why does `onnx-tensorflow` use multiple Conv2D layers instead of one GroupedConv2D layer? This increases inference time, so a PB-model converted with `onnx-tensorflow` is slower than the same native TF-model.

If I want to convert NCHW-ONNX to NCHW-PB to run on GPU, and then convert NCHW-PB to NHWC-TFLITE, how should I do this? (Currently
`tf.lite.TFLiteConverter.from_saved_model(saved_model_export_dir)` allows converting NCHW-PB to NHWC-TFLITE without adding extra Transpose layers.)

**To Reproduce**
https://colab.research.google.com/gist/AlexeyAB/10b1bf880152b1ad7ca116b428863068/pytorch-onnx-tf-extra-transpose.ipynb
pytorch_onnx_tf_extra_transpose_ipynp.zip
**Python, ONNX, ONNX-TF, Tensorflow version**

This section can be obtained by running `get_version.py` from the util folder.

**Additional context**
PB-model with extra Transpose layers:
Result TFLITE-model with extra Transpose layers:
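The extra Transpose layers visible in those graphs come in pairs around each layout-sensitive op; numerically the wrapped pattern is equivalent to running the op in NCHW directly. A numpy sketch using ReLU as a stand-in op (shapes illustrative):

```python
import numpy as np

relu = lambda t: np.maximum(t, 0)  # stand-in for any elementwise NHWC op

x_nchw = np.random.randn(1, 3, 8, 8)

# Pattern emitted by the converter: Transpose -> NHWC op -> Transpose back
y_wrapped = relu(x_nchw.transpose(0, 2, 3, 1)).transpose(0, 3, 1, 2)

# Same result as applying the op in NCHW directly -- the Transpose pair
# changes no values, it only adds memory traffic at inference time.
assert np.array_equal(y_wrapped, relu(x_nchw))
```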