I am trying to convert a TensorFlow model trained on Edge Impulse to TensorFlow Lite so that it can run on my Google Coral Dev Board Micro. The model is MobileNetV2 SSD FPN-Lite 320x320, trained on custom images. The TF SavedModel file is here.
With that, I converted it to TFLite using the following code:
```python
import numpy as np
import tensorflow as tf

# Load model
loaded_model = tf.saved_model.load(TF_MODEL_PATH)

# Get input dimensions
input_dims = loaded_model.signatures["serving_default"].inputs[0].shape

# Set default dimensions (the SavedModel exports a dynamic input shape)
input_dims = list(input_dims)
if None in input_dims:
    print("None dimension found, setting defaults")
    input_dims[0] = input_dims[0] or 1  # batch must be concrete for the dataset
    input_dims[1] = 320
    input_dims[2] = 320
    input_dims[3] = input_dims[3] or 3  # assume RGB if channels are dynamic
input_dims = tuple(input_dims)

# Construct a representative dataset for quantization
# NOTE: this works for images only (assumes uniform 2D images). You
# will need to modify this for other types of data
def get_representative_dataset():
    for _ in range(250):
        data = np.random.uniform(0, 1.0, size=input_dims).astype(np.float32)
        yield [data]

# Convert to a fully int8-quantized TFLite model with uint8 I/O
converter = tf.lite.TFLiteConverter.from_saved_model(TF_MODEL_PATH)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
converter.representative_dataset = get_representative_dataset
tflite_model = converter.convert()

# Save to disk
with open(TFLITE_MODEL_PATH, "wb") as f:
    f.write(tflite_model)
```
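As a sanity check on the conversion step above, the resulting flatbuffer can be inspected to confirm the uint8 input/output types actually took effect. A minimal self-contained sketch, using a tiny stand-in Conv2D model since the real SavedModel isn't included here:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in model (assumption: in practice the real SSD SavedModel
# would be loaded with from_saved_model instead).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(4, 3, activation="relu"),
])

def representative_dataset():
    for _ in range(10):
        yield [np.random.uniform(0, 1.0, size=(1, 64, 64, 3)).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
converter.representative_dataset = representative_dataset
tflite_model = converter.convert()

# Inspect the result: the input and output tensors should now be uint8.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["dtype"])
print(interpreter.get_output_details()[0]["dtype"])
```

If either dtype still comes back as float32, the representative dataset or the converter flags were not applied.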
From there, I used the Edge TPU compiler to map most of the ops to the TPU:
```
Edge TPU Compiler version 16.0.384591198
Started a compilation timeout timer of 180 seconds.

Model compiled successfully in 888 ms.

Input model: ei-xrp-bucket-delivery.lite
Input size: 3.57MiB
Output model: ei-xrp-bucket-delivery_edgetpu.tflite
Output size: 3.89MiB
On-chip memory used for caching model parameters: 3.20MiB
On-chip memory remaining for caching model parameters: 4.40MiB
Off-chip memory used for streaming uncached model parameters: 0.00B
Number of Edge TPU subgraphs: 1
Total number of operations: 166
Operation log: ei-xrp-bucket-delivery_edgetpu.log

Model successfully compiled but not all operations are supported by the Edge TPU. A percentage of the model will instead run on the CPU, which is slower. If possible, consider updating your model to use only operations supported by the Edge TPU. For details, visit g.co/coral/model-reqs.
Number of operations that will run on Edge TPU: 112
Number of operations that will run on CPU: 54
See the operation log file for individual operation details.
Compilation child process completed within timeout period.
Compilation succeeded!
```
I then used the following code to run the compiled model on my Google Coral Micro.
The board gives me the following error:
```
Didn't find op for builtin opcode 'PACK' version '1'. An older version of this builtin might be supported. Are you using an old TFLite binary with a newer model?
Failed to get registration from op code PACK
ERROR: AllocateTensors() failed
```
I have tried the TFLite conversion process with TF versions 2.8.2, 2.11, and 2.14; they all result in the same error. From what I understand, the PACK operation is supported by TFLite.
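For context on where PACK comes from: in TFLite graphs it is typically the lowering of `tf.stack`, and in SSD-style models it usually originates in the box/anchor postprocessing (an assumption here, since the op log isn't shown). The desktop TFLite interpreter does register PACK, which suggests the failure lies in the op resolver of the on-board TFLite Micro build rather than in the model. A small sketch demonstrating both the lowering and that full TFLite executes it:

```python
import numpy as np
import tensorflow as tf

# tf.stack lowers to the PACK builtin in the converted flatbuffer.
@tf.function(input_signature=[tf.TensorSpec([4], tf.float32),
                              tf.TensorSpec([4], tf.float32)])
def stack_fn(a, b):
    return tf.stack([a, b], axis=0)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [stack_fn.get_concrete_function()])
tflite_model = converter.convert()

# Desktop TFLite registers PACK, so this model allocates and runs fine --
# unlike TFLite Micro, which only knows the ops registered in its resolver.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inputs = interpreter.get_input_details()
interpreter.set_tensor(inputs[0]["index"], np.arange(4, dtype=np.float32))
interpreter.set_tensor(inputs[1]["index"], np.arange(4, 8, dtype=np.float32))
interpreter.invoke()
out = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
print(out.shape)  # (2, 4)
```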
### Issue Type
Support
### Operating System
Linux
### Coral Device
Dev Board Micro
### Other Devices
_No response_
### Programming Language
C++
### Relevant Log Output
_No response_