I am encountering an error when trying to convert a SavedModel (.pb) into a .tflite file: the converter tells me the TFLite interpreter needs to link the Flex delegate in order to run the model. Here is the full error log (the code I ran is below).
2021-10-15 21:13:24.476697: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2021-10-15 21:13:24.477513: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
2021-10-15 21:14:12.107673: W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:351] Ignored output_format.
2021-10-15 21:14:12.107758: W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:354] Ignored drop_control_dependency.
2021-10-15 21:14:12.107821: W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:360] Ignored change_concat_input_ranges.
2021-10-15 21:14:12.108753: I tensorflow/cc/saved_model/reader.cc:38] Reading SavedModel from: C:\Users\godlo\Documents\TensorFlow\workspace\training_demo\exported-models\my_model\saved_model
2021-10-15 21:14:12.181869: I tensorflow/cc/saved_model/reader.cc:90] Reading meta graph with tags { serve }
2021-10-15 21:14:12.181946: I tensorflow/cc/saved_model/reader.cc:132] Reading SavedModel debug info (if present) from: C:\Users\godlo\Documents\TensorFlow\workspace\training_demo\exported-models\my_model\saved_model
2021-10-15 21:14:12.182174: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
2021-10-15 21:14:12.504754: I tensorflow/cc/saved_model/loader.cc:211] Restoring SavedModel bundle.
2021-10-15 21:14:13.431946: I tensorflow/cc/saved_model/loader.cc:195] Running initialization op on SavedModel bundle at path: C:\Users\godlo\Documents\TensorFlow\workspace\training_demo\exported-models\my_model\saved_model
2021-10-15 21:14:13.696451: I tensorflow/cc/saved_model/loader.cc:283] SavedModel load for tags { serve }; Status: success: OK. Took 1587684 microseconds.
2021-10-15 21:14:14.479694: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:210] disabling MLIR crash reproducer, set env var MLIR_CRASH_REPRODUCER_DIRECTORY to enable.
2021-10-15 21:14:15.071035: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
2021-10-15 21:14:15.760240: W tensorflow/compiler/mlir/lite/flatbuffer_export.cc:1838] TFLite interpreter needs to link Flex delegate in order to run the model since it contains the following flex op(s):
Flex ops: FlexConcatV2
Details:
tf.ConcatV2(tensor, tensor, tensor, tensor, tensor) -> (tensor<4xf32>) : {device = ""}
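Note: the message at the end of the log is a warning, not a conversion failure. With SELECT_TF_OPS enabled the converter still writes a .tflite; the Flex delegate only has to be linked by whatever runtime later loads the model. A minimal sketch of this, using a hypothetical stand-in graph since the exported detector itself is not attached here:

```python
import tensorflow as tf

# Hypothetical stand-in for the exported detector: any TF graph
# converts the same way, so a tiny tf.function is used here.
@tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
def stand_in(x):
    return tf.concat([x, x], axis=1)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [stand_in.get_concrete_function()]
)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # TFLite builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF (Flex) kernels
]
tflite_bytes = converter.convert()

# Conversion still succeeds when the Flex warning is printed; the
# result is a valid TFLite flatbuffer (file identifier "TFL3").
assert tflite_bytes[4:8] == b"TFL3"
```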
3. Steps to reproduce
I first went to the TensorFlow Lite converter page (https://www.tensorflow.org/lite/convert) and copied the code under "Convert a SavedModel (recommended)", with slight changes.
import tensorflow as tf

saved_model_dir = "C:\\Users\\godlo\\Documents\\TensorFlow\\workspace\\training_demo\\exported-models\\my_model\\saved_model"
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
By then I already had the SavedModel (.pb) ready to be converted; saved_model_dir points to the directory containing it. When I run the code, I get the error output above.
4. Expected behavior
I will get a .tflite file
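For comparison, a sketch of how a converted model with select TF ops can be verified end to end: the full TensorFlow pip package bundles the Flex delegate, so the Python interpreter should load such a model without extra linking steps (the tiny graph below is a hypothetical stand-in for the real model):

```python
import numpy as np
import tensorflow as tf

# Hypothetical tiny graph standing in for the converted detector.
@tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
def stand_in(x):
    return x + 1.0

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [stand_in.get_concrete_function()]
)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_bytes = converter.convert()

# The Python tf.lite.Interpreter from the full tensorflow package
# links the Flex delegate automatically, so models with select TF
# ops (e.g. FlexConcatV2) load and run without extra steps.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], np.float32))
interpreter.invoke()
out = interpreter.get_output_details()[0]
print(interpreter.get_tensor(out["index"]).shape)  # (1, 8)
```

On mobile, the equivalent is adding the select-TF-ops runtime library to the app's dependencies so the Flex kernels are linked in.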
5. System information
Windows 10, 64-bit
TensorFlow 2.5.0, installed with: pip install --ignore-installed --upgrade tensorflow==2.5.0
Prerequisites
Please answer the following questions for yourself before submitting an issue.
1. The entire URL of the file you are using
https://github.com/tensorflow/models/tree/master/research/object_detection