mohanrajroboticist opened this issue 1 year ago
https://github.com/onnx/onnx-tensorflow/issues/862
https://github.com/onnx/onnx-tensorflow/issues/862#issuecomment-776303182
Currently ONNX supports NCHW only. That means the model and node inputs must be in NCHW so the operators can work according to the spec. To support NHWC, an additional option would be needed in ONNX to indicate the data format (NCHW or NHWC).
The main problem is converting from PyTorch to ONNX to TensorFlow. Even though the conversion does not raise any error, the converter (onnx_tf) saves the model in PyTorch's layout (NCHW), and they have an open issue on this.
Channels-First: NCHW - The channels come before the height and width dimensions
Channels-Last: NHWC - The channels come after the height and width dimensions
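The two layouts differ only in axis order, so a single transpose converts between them. A minimal NumPy sketch (the shapes are illustrative, not from the issue):

```python
import numpy as np

# A batch of 2 RGB images, 4x4 pixels, in PyTorch's channels-first layout.
nchw = np.zeros((2, 3, 4, 4), dtype=np.float32)

# Move the channel axis to the end to get TensorFlow's channels-last layout.
nhwc = np.transpose(nchw, (0, 2, 3, 1))  # shape (2, 4, 4, 3)

# Going back: channels-last -> channels-first.
back = np.transpose(nhwc, (0, 3, 1, 2))  # shape (2, 3, 4, 4)
```

This is exactly the reordering that the onnx_tf conversion leaves undone, so a TensorFlow graph produced this way still expects NCHW inputs.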
Performing post-training full integer quantisation, in the TensorFlow framework, of a model originally developed in PyTorch throws an error at the last step of the following pipeline:
PyTorch model -> ONNX model -> TensorFlow model -> TensorFlow PTQ Quantisation
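The failing step is full-integer PTQ, which calibrates a scale and zero-point for each tensor from representative data and then maps floats to int8. A simplified NumPy sketch of the standard affine int8 scheme (per-tensor only; an illustration of the idea, not the actual TFLite converter code):

```python
import numpy as np

QMIN, QMAX = -128, 127  # int8 range

def calibrate_int8(calibration_data):
    """Derive a per-tensor scale/zero-point from representative float data."""
    rmin = min(float(calibration_data.min()), 0.0)  # range must include 0
    rmax = max(float(calibration_data.max()), 0.0)
    scale = (rmax - rmin) / (QMAX - QMIN)
    zero_point = int(np.clip(round(QMIN - rmin / scale), QMIN, QMAX))
    return scale, zero_point

def quantize(x, scale, zero_point):
    return np.clip(np.round(x / scale) + zero_point, QMIN, QMAX).astype(np.int8)

def dequantize(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

# Hypothetical calibration batch standing in for real model activations.
rng = np.random.default_rng(0)
calib = rng.uniform(-1.0, 2.0, size=1000).astype(np.float32)

scale, zp = calibrate_int8(calib)
q = quantize(calib, scale, zp)
x_hat = dequantize(q, scale, zp)
max_err = float(np.abs(calib - x_hat).max())  # bounded by roughly scale / 2
```

Because this calibration runs inference on the converted graph, an NCHW-shaped graph fed NHWC-shaped representative data (or vice versa) fails at exactly this stage.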
Reference: https://stackoverflow.com/questions/66957392/tensorflow-lite-runtimeerror