onnx / onnx-tensorflow

Tensorflow Backend for ONNX

pytorch->onnx->tensorflow: how to convert from NCHW (onnx) to NHWC (tensorflow lite) #862

Open hufangjian opened 3 years ago

hufangjian commented 3 years ago

I converted a model from PyTorch to ONNX and then to TFLite. The TFLite model uses NCHW and runs slowly on Android, but the same model in NHWC shape is faster.

chinhuang007 commented 3 years ago

The data format in ONNX is NCHW. We can convert a model to either NCHW or NHWC in TensorFlow. Please make sure the device is set to "CPU" by passing that argument to either the CLI or prepare(); the converted model will then be in NHWC.

hufangjian commented 3 years ago

my command: `onnx-tf convert -i Zero_DCE_640_dele.sim.onnx -o test --device CPU`
but the TensorFlow Lite model is NCHW, not NHWC.

chinhuang007 commented 3 years ago

In that case, try --device CUDA. The model will have NCHW wherever possible.

hufangjian commented 3 years ago

I tried `onnx-tf convert -i Zero_DCE_640_dele.sim.onnx -o test --device CUDA`. It did not work. Have you ever tested any models?

chinhuang007 commented 3 years ago

We have tests with all ONNX model zoo models.

Please be more specific about your issue. Is the conversion not working? Is inference not working or throwing exceptions? Or are the inference results different from the original PyTorch model?

It would also be helpful to include the ONNX file so others can help debug.

hufangjian commented 3 years ago

This is my ONNX file, converted from PyTorch. The input shape is (1x3x360x640), NCHW. model.zip

  1. Run `onnx-tf convert -i Zero_DCE_640_dele.sim.onnx -o test --device CUDA` to produce a TensorFlow saved_model

  2. Convert the saved_model to tflite:

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("test")
tflite_model = converter.convert()
with open('Zero_dce.tflite', 'wb') as f:
    f.write(tflite_model)
```

The tflite model's input shape is (1x3x360x640) NCHW, not (1x360x640x3) NHWC.

On an Android device both NCHW and NHWC models run, but NHWC is about 5 times faster than NCHW.

How can I get the NHWC format from ONNX?
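Until the converter can emit an NHWC graph, one workaround at the application boundary is to transpose the buffers yourself before and after inference; the layout change is just an axis permutation. A minimal numpy sketch (shapes taken from the model above):

```python
import numpy as np

# A batch in NCHW layout, matching the model above: (1, 3, 360, 640).
x_nchw = np.zeros((1, 3, 360, 640), dtype=np.float32)

# NCHW -> NHWC: move the channel axis to the end.
x_nhwc = x_nchw.transpose(0, 2, 3, 1)
print(x_nhwc.shape)  # (1, 360, 640, 3)

# NHWC -> NCHW: the inverse permutation.
x_back = x_nhwc.transpose(0, 3, 1, 2)
print(x_back.shape)  # (1, 3, 360, 640)
```

Note this only adapts the data at the edges; the graph's internal ops stay NCHW, so by itself it will not recover the mobile speedup.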

chinhuang007 commented 3 years ago

Currently ONNX supports NCHW only. That means the model and node inputs must be in NCHW so the operators can work according to the specs. To support NHWC, an additional option would be needed in ONNX to indicate the data format, NCHW or NHWC.

Please open an issue (feature request) in ONNX core project, https://github.com/onnx/onnx/issues, if this feature is critical for you. Thanks!
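For context on what a full layout conversion entails: besides transposing activations, a converter also has to permute every convolution kernel, since PyTorch/ONNX store conv weights as OIHW while TensorFlow's Conv2D expects HWIO. A minimal numpy sketch (the shapes are illustrative, not from this model):

```python
import numpy as np

# An ONNX/PyTorch conv weight: (out_channels, in_channels, kH, kW) = OIHW.
w_oihw = np.zeros((32, 3, 3, 3), dtype=np.float32)

# TensorFlow expects (kH, kW, in_channels, out_channels) = HWIO.
w_hwio = w_oihw.transpose(2, 3, 1, 0)
print(w_hwio.shape)  # (3, 3, 3, 32)
```

Per-op attributes (strides, pads, axes) have to be permuted consistently as well, which is why this is done by conversion tools rather than by hand.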

buptlj commented 3 years ago

> This is my ONNX file, converted from PyTorch. The input shape is (1x3x360x640), NCHW. [...] How can I get the NHWC format from ONNX?

Hi, have you solved the problem?

hufangjian commented 3 years ago

I converted it via OpenVINO.

buptlj commented 3 years ago

> I converted it via OpenVINO.

I tried that too (pytorch->onnx->openvino->tensorflow), but the pipeline is too long and the final output has a large error.

caijinana commented 3 years ago

`'tfl.fully_connected' op expect 2d filter, got 'tensor<1x512x784xf32>'`

yahuuu commented 3 years ago

> This is my ONNX file, converted from PyTorch. The input shape is (1x3x360x640), NCHW. [...] How can I get the NHWC format from ONNX?

How can I get an NHWC-format .pb from ONNX directly? @chinhuang007

talcs commented 2 years ago

You might want to take a look at onnx2keras' code for changing the dimension ordering from NCHW to NHWC.

Note:

When installing onnx2keras via pip, the change_ordering feature failed the whole conversion with a spurious channel-count mismatch in a BatchNorm layer. However, after cloning the git repo, checking out commit 24f1eaf716d01032bbdd1552a0023e23185454d6, and installing with python setup.py install, conversion with change_ordering worked perfectly.

abhishek27m1992github commented 5 months ago

> additional option is needed in ONNX to indicate the data format, NCHW or NHWC.

What is that option, and how do I set it?