nitolpalak opened this issue 2 months ago
The full error:
error: 'tf.TensorListSetItem' op is neither a custom op nor a flex op
error: failed while converting: 'main':
Some ops are not supported by the native TFLite runtime, you can enable TF kernels fallback using TF Select. See instructions: https://www.tensorflow.org/lite/guide/ops_select
TF Select ops: Angle, Exp, IRFFT, TensorListReserve, TensorListSetItem, TensorListStack
Details:
tf.Angle(tensor<?x?x257xcomplex<f32>>) -> (tensor<?x?x257xf32>) : {device = ""}
tf.Exp(tensor<?x?x257xcomplex<f32>>) -> (tensor<?x?x257xcomplex<f32>>) : {device = ""}
tf.IRFFT(tensor<?x?x257xcomplex<f32>>, tensor<1xi32>) -> (tensor<?x?x512xf32>) : {device = ""}
tf.TensorListReserve(tensor<2xi32>, tensor<i32>) -> (tensor<!tf_type.variant<tensor<?x128xf32>>>) : {device = ""}
tf.TensorListSetItem(tensor<!tf_type.variant<tensor<?x128xf32>>>, tensor<i32>, tensor<?x128xf32>) -> (tensor<!tf_type.variant<tensor<?x128xf32>>>) : {device = "", resize_if_index_out_of_bounds = false}
tf.TensorListStack(tensor<!tf_type.variant<tensor<?x128xf32>>>, tensor<2xi32>) -> (tensor<?x?x128xf32>) : {device = "", num_elements = -1 : i64}
Traceback (most recent call last):
File "/home/palak/Projects/Personal_Projects/integration/main.py", line 50, in <module>
tflite_model = converter.convert()
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/lite.py", line 1139, in wrapper
return self._convert_and_export_metrics(convert_func, *args, **kwargs)
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/lite.py", line 1093, in _convert_and_export_metrics
result = convert_func(self, *args, **kwargs)
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/lite.py", line 1601, in convert
saved_model_convert_result = self._convert_as_saved_model()
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/lite.py", line 1582, in _convert_as_saved_model
return super(TFLiteKerasModelConverterV2, self).convert(
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/lite.py", line 1371, in convert
result = _convert_graphdef(
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/convert_phase.py", line 212, in wrapper
raise converter_error from None # Re-throws the exception.
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/convert_phase.py", line 205, in wrapper
return func(*args, **kwargs)
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/convert.py", line 984, in convert_graphdef
data = convert(
File "/home/palak/Projects/Personal_Projects/integration/integration_test/lib/python3.10/site-packages/tensorflow/lite/python/convert.py", line 366, in convert
raise converter_error
tensorflow.lite.python.convert_phase.ConverterError: Could not translate MLIR to FlatBuffer. UNKNOWN: /home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.Angle' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.Exp' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListReserve' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListStack' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListReserve' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListStack' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.IRFFT' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListReserve' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListStack' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListReserve' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListStack' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListSetItem' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: called from
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListSetItem' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: called from
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListSetItem' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: called from
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: error: 'tf.TensorListSetItem' op is neither a custom op nor a flex op
tflite_model = converter.convert()
^
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: called from
tflite_model = converter.convert()
^
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
/home/palak/Projects/Personal_Projects/integration/main.py:50:1: note: Error code: ERROR_NEEDS_FLEX_OPS
tflite_model = converter.convert()
^
<unknown>:0: error: failed while converting: 'main':
Some ops are not supported by the native TFLite runtime, you can enable TF kernels fallback using TF Select. See instructions: https://www.tensorflow.org/lite/guide/ops_select
TF Select ops: Angle, Exp, IRFFT, TensorListReserve, TensorListSetItem, TensorListStack
Details:
tf.Angle(tensor<?x?x257xcomplex<f32>>) -> (tensor<?x?x257xf32>) : {device = ""}
tf.Exp(tensor<?x?x257xcomplex<f32>>) -> (tensor<?x?x257xcomplex<f32>>) : {device = ""}
tf.IRFFT(tensor<?x?x257xcomplex<f32>>, tensor<1xi32>) -> (tensor<?x?x512xf32>) : {device = ""}
tf.TensorListReserve(tensor<2xi32>, tensor<i32>) -> (tensor<!tf_type.variant<tensor<?x128xf32>>>) : {device = ""}
tf.TensorListSetItem(tensor<!tf_type.variant<tensor<?x128xf32>>>, tensor<i32>, tensor<?x128xf32>) -> (tensor<!tf_type.variant<tensor<?x128xf32>>>) : {device = "", resize_if_index_out_of_bounds = false}
tf.TensorListStack(tensor<!tf_type.variant<tensor<?x128xf32>>>, tensor<2xi32>) -> (tensor<?x?x128xf32>) : {device = "", num_elements = -1 : i64}
Hello @StuartIanNaylor. Thank you for your reply, but I am not sure how this helps in my case. Could you please explain, if possible? Sorry for the inconvenience.
Kudos to the cool tool tflite2tensorflow made by PINTO0309. I used it to convert the DTLN-aec tflite models to quantized versions, since quantizing them directly was problematic. The quantized models are here: https://github.com/SaneBow/PiDTLN/tree/main/models
Hello,
I am trying to learn about quantization, so I was playing with a GitHub repo and trying to quantize its model into int8 format. I used the following code to quantize the model.
For the representative data, I converted the data to NumPy arrays, saved them as .npy files, and then used the following code to feed them in as the representative dataset.
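(That code is also missing from the thread; a generator over saved .npy files might look like this, where the directory and shapes are illustrative, not the poster's actual data.)

```python
import glob
import numpy as np

def representative_dataset_gen():
    """Yield one calibration sample at a time from saved .npy files."""
    for path in sorted(glob.glob("rep_data/*.npy")):
        sample = np.load(path).astype(np.float32)
        # TFLite expects a list of input tensors with a leading batch dim of 1.
        yield [np.expand_dims(sample, axis=0)]
```

It would then be hooked up via `converter.representative_dataset = representative_dataset_gen`.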
But after I run the code I get the following error:
I have tried to follow the docs and some GitHub issues such as https://github.com/tensorflow/tensorflow/issues/34350#issuecomment-579027135, and I also went through a similar question: Issue with tf.ParseExampleV2 when converting to Tensorflow Lite: "op is neither a custom op nor a flex op".
But none of those seemed helpful in my case.
Can anyone help me figure out what I am doing wrong? Thanks in advance.
I am adding the full error in my first comment.