Closed: ambitious-octopus closed this issue 2 months ago.
Please submit an issue to TensorFlow; this is a TensorFlow runtime issue.
The problem remains.
YOLO11 conversion to TFLite INT8 error
Traceback (most recent call last):
  File "/environment/miniconda3/lib/python3.11/site-packages/onnx2tf/onnx2tf.py", line 1566, in convert
    tflite_model = converter.convert()
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/lite.py", line 1231, in wrapper
    return self._convert_and_export_metrics(convert_func, *args, **kwargs)
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/lite.py", line 1183, in _convert_and_export_metrics
    result = convert_func(self, *args, **kwargs)
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/lite.py", line 1562, in convert
    return self._convert_from_saved_model(graph_def)
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/lite.py", line 1424, in _convert_from_saved_model
    return self._optimize_tflite_model(
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/convert_phase.py", line 215, in wrapper
    raise error from None  # Re-throws the exception.
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/convert_phase.py", line 205, in wrapper
    return func(*args, **kwargs)
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/lite.py", line 1127, in _optimize_tflite_model
    model = self._quantize(
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/lite.py", line 767, in _quantize
    return calibrate_quantize.calibrate_and_quantize(
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/convert_phase.py", line 215, in wrapper
    raise error from None  # Re-throws the exception.
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/convert_phase.py", line 205, in wrapper
    return func(*args, **kwargs)
  File "/environment/miniconda3/lib/python3.11/site-packages/tensorflow/lite/python/optimize/calibrator.py", line 194, in calibrate_and_quantize
    return self._calibrator.QuantizeModel(
RuntimeError: Quantized dimension for tensor property and quantization parameters do not match. Got 3 and 0 respectively.

WARNING: INT8 Quantization with int16 activations tflite output failed.
W0000 00:00:1727968968.603408 28874 tf_tfl_flatbuffer_helpers.cc:392] Ignored output_format.
W0000 00:00:1727968968.603928 28874 tf_tfl_flatbuffer_helpers.cc:395] Ignored drop_control_dependency.
Traceback (most recent call last):
  File "/environment/miniconda3/lib/python3.11/site-packages/onnx2tf/onnx2tf.py", line 1597, in convert
    tflite_model = converter.convert()
  [same call chain through lite.py, convert_phase.py, and calibrator.py as above]
RuntimeError: Quantized dimension for tensor property and quantization parameters do not match. Got 3 and 0 respectively.
WARNING: Full INT8 Quantization with int16 activations tflite output failed.
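For context, both failures are raised inside TensorFlow's post-training calibrator (calibrator.py), after onnx2tf has already written the SavedModel. Below is a minimal sketch of the converter configuration that reaches this code path; the SavedModel directory, input shape, and random calibration data are placeholders for illustration, not values taken from this report.

```python
import numpy as np
import tensorflow as tf

# Minimal sketch, assuming onnx2tf has already exported a SavedModel to
# "saved_model/" and that the model takes a 1x640x640x3 float32 input.
# Both of those are assumptions, not details confirmed by this issue.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_dataset():
    # Random tensors stand in for real calibration images here.
    for _ in range(10):
        yield [np.random.rand(1, 640, 640, 3).astype(np.float32)]

converter.representative_dataset = representative_dataset

# INT8 weights with int16 activations, the combination named in the warnings above.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.EXPERIMENTAL_TFLITE_BUILTINS_ACTIVATIONS_INT16_WEIGHTS_INT8
]

# The RuntimeError above surfaces here, inside calibrator.QuantizeModel.
tflite_model = converter.convert()
```

If this standalone snippet reproduces the same RuntimeError against the SavedModel that onnx2tf writes, that would support reporting the problem to the TensorFlow/TFLite tracker rather than to onnx2tf.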
Issue Type
Others
OS
Linux
onnx2tf version number
1.22.3
onnx version number
1.16.2
onnxruntime version number
1.19.0
onnxsim (onnx_simplifier) version number
0.4.36
tensorflow version number
2.17.0
Download URL for ONNX
https://filetransfer.io/data-package/m6Wg8EXk#link
Parameter Replacement JSON
Description
Running the command:
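The exact command line was not preserved in this report. As a hedged reconstruction, a conversion that requests the integer-quantized TFLite outputs can be invoked through onnx2tf's Python API roughly as follows; the file name is a placeholder and the parameter names mirror onnx2tf's documented options, so treat them as assumptions.

```python
import onnx2tf

# Hypothetical reproduction of the conversion step (equivalent to the -oiqt CLI flag).
# "yolo11.onnx" is a placeholder for the model from the download link above.
onnx2tf.convert(
    input_onnx_file_path="yolo11.onnx",
    output_integer_quantized_tflite=True,
)
```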
Leads to:
the tracebacks shown at the top of this report for the INT8 Quantization with int16 activations tflite output and its full-integer variant.