martinmmi opened 10 months ago
Hello, I'm Martin.
I'm having problems running a self-made TensorFlow Lite custom model on the Edge TPU.
It's a model pretrained on the COCO dataset; I chose the "ssd-mobilenet-v2-fpnlite-320" model for my own training.
I converted the model following these instructions: https://coral.ai/docs/edgetpu/compiler/#download
Every time I try to run the model with the Edge TPU, it runs only on the CPU. When I use a ready-made Edge TPU model from Google:
https://dl.google.com/coral/canned_models/mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite
it works fine, with a frame rate between 18 and 20 fps.
What can I do?
I think it's an unsupported data type problem during the conversion.
I used this guide for my custom models: https://github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/deploy_guides/Raspberry_Pi_Guide.md
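For reference, one quick way to tell whether the compilation step actually worked: the Edge TPU compiler replaces each subgraph it can map with a custom op named "edgetpu-custom-op", and that name is embedded as a plain byte string in the compiled .tflite file. If the marker is absent, the runtime silently falls back to the CPU. Here is a minimal stdlib-only sketch of that check (the file path in the comment is a placeholder, not from my attachment):

```python
# Sanity check: the Edge TPU compiler replaces mapped subgraphs with a custom
# op literally named "edgetpu-custom-op". If that byte string is absent from
# the compiled .tflite, no ops were delegated and inference runs on the CPU.

def compiled_for_edgetpu(model_bytes: bytes) -> bool:
    """True if the model contains at least one Edge TPU-delegated subgraph."""
    return b"edgetpu-custom-op" in model_bytes

# Usage (placeholder path -- point it at your compiled model):
# with open("custom_model_lite/detect_edgetpu.tflite", "rb") as f:
#     print("Edge TPU ops present:", compiled_for_edgetpu(f.read()))
```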
Attached is my complete custom model: custom_model_lite.zip
Thanks so much!