Open yrik opened 1 year ago
Archive.zip I've attached the .tflite and _edgetpu.tflite versions of the model.
Did you have any luck with this, @yrik? I'm having the same issue. I've seen that some of this is related to power problems, but my Coral is plugged into a desktop with fully powered USB ports and I'm getting the same error.
I've realized that the Coral TPU does not support larger models, so if you have many classes or a higher input resolution, it will crash. For me, 2 classes at 640x640 worked fine, but with 80 classes it crashed.
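For what it's worth, the class count really does grow the output tensor. A rough back-of-the-envelope sketch (assuming the standard YOLOv8 detection head, which predicts `4 + num_classes` values per anchor over strides 8/16/32; the function name here is just illustrative):

```python
def yolov8_output_elems(imgsz, num_classes):
    """Approximate element count of the YOLOv8 head output for a square input.

    Anchors come from three feature maps at strides 8, 16, and 32;
    each anchor predicts 4 box values plus one score per class.
    """
    anchors = sum((imgsz // stride) ** 2 for stride in (8, 16, 32))
    return (4 + num_classes) * anchors

print(yolov8_output_elems(640, 2))   # 6 x 8400 = 50400 elements
print(yolov8_output_elems(640, 80))  # 84 x 8400 = 705600 elements
print(yolov8_output_elems(192, 80))  # much smaller at imgsz=192
```

So going from 2 to 80 classes at 640x640 multiplies the head output by 14x, which can push intermediate buffers past what the Edge TPU's on-chip memory handles, though (as noted below) input size is the bigger lever.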
The same happens to me. By the way, @yrik, how did you train the model (code in Python, or using the YOLO CLI)? Help is appreciated. Thanks in advance.
While reading other issues, I also discovered that it's not really a problem with the number of classes (as @yrik said), but with one thing: the image size. YOLOv8 is a large model (even YOLOv8n), and the Edge TPU doesn't have enough compute and memory to run it at full resolution. This is just from reading other issues. The key to making it work:
To convert the model to edgetpu format and run inference on the TPU without crashing, just run this command (it uses the YOLO CLI and the pre-trained YOLOv8n model):

```shell
yolo export model=yolov8n.pt format=edgetpu imgsz=192,192
```
I hope this helps beginners.
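One gotcha if you feed frames to the exported model yourself: the input must match the `imgsz` you exported with. A minimal preprocessing sketch (assumptions: NHWC uint8 input at 192x192 and YOLO-style grey letterbox padding; the nearest-neighbor resize via NumPy indexing is only for self-containment, a real pipeline would use cv2 or PIL):

```python
import numpy as np

def letterbox_to_192(img):
    """Resize an HxWx3 uint8 image to fit 192x192, padding with grey (114).

    Keeps aspect ratio, centers the image, and adds a batch dimension
    so the result is (1, 192, 192, 3) as a TFLite NHWC input expects.
    """
    h, w = img.shape[:2]
    scale = 192 / max(h, w)
    nh = max(1, int(round(h * scale)))
    nw = max(1, int(round(w * scale)))
    # Nearest-neighbor resize via index lookup (stand-in for cv2.resize)
    ys = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    xs = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[ys][:, xs]
    canvas = np.full((192, 192, 3), 114, dtype=np.uint8)  # YOLO grey padding
    top, left = (192 - nh) // 2, (192 - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas[None]  # (1, 192, 192, 3)

frame = np.zeros((100, 200, 3), dtype=np.uint8)
batch = letterbox_to_192(frame)
print(batch.shape)  # (1, 192, 192, 3)
```

Passing a tensor of the wrong size to the Edge TPU interpreter is another easy way to get a crash at inference time rather than a clean error.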
Description
I've compiled a default YOLOv8n model for the Edge TPU, but it crashes when I try to run inference.
I'm running it on a Raspberry Pi 4 with the Coral USB Accelerator dongle.
Issue Type
Bug
Operating System
Linux
Coral Device
USB Accelerator
Other Devices
Raspberry Pi 4
Programming Language
Python 3.9
Relevant Log Output