robmarkcole / coral-pi-rest-server

Perform inferencing of tensorflow-lite models on an RPi with acceleration from Coral USB stick
https://coral.ai/products/accelerator
MIT License

Try out efficient net #31

Open robmarkcole opened 5 years ago

robmarkcole commented 5 years ago

Promises to be more performant: https://coral.withgoogle.com/models

robmarkcole commented 5 years ago

Turns out you need to create the model before using it, but this wasn't clear, so I sent an email to Manoj. The reply:

Your model needs to be quantized. Try the quantized models compiled for the Edge TPU: https://coral.withgoogle.com/models/

It is recommended to go through our documents. For example, this page has a nice summary of the workflow: https://coral.withgoogle.com/docs/edgetpu/models-intro/
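To make the workflow concrete, here is a minimal sketch of loading a quantized, Edge TPU-compiled model with `tflite_runtime` and reading back a dequantized score. The model filename and the `classify` helper are assumptions for illustration, not from this repo; it requires the Coral USB stick and `libedgetpu` installed on the Pi.

```python
# Sketch only: model path and helper names are placeholders, not this repo's API.
import numpy as np

try:
    from tflite_runtime.interpreter import Interpreter, load_delegate
except ImportError:  # tflite_runtime is only installed on the Pi itself
    Interpreter = load_delegate = None

def dequantize(q, scale, zero_point):
    """Map a quantized uint8 output value back to a real-valued score."""
    return scale * (int(q) - zero_point)

def make_interpreter(model_path):
    """Load an Edge TPU-compiled .tflite model with the Edge TPU delegate."""
    interpreter = Interpreter(
        model_path=model_path,  # e.g. an *_edgetpu.tflite file from coral.withgoogle.com/models
        experimental_delegates=[load_delegate("libedgetpu.so.1")],
    )
    interpreter.allocate_tensors()
    return interpreter

def classify(interpreter, image):
    """Run one inference; return (class_index, dequantized_score) of the top class."""
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], image[np.newaxis, ...])
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    scale, zero_point = out["quantization"]
    top = int(np.argmax(scores))
    return top, dequantize(scores[top], scale, zero_point)
```

The key point from the reply above: only a quantized model that has been compiled for the Edge TPU will actually run on the accelerator; an ordinary float .tflite file will not.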

grinco commented 2 years ago

I'm testing EfficientDet-Lite3 and EfficientDet-Lite3x now. 3x is probably not going to work for everyone (it requires the M.2 TPU), so I'll be focusing on EfficientDet-Lite3. I wasn't satisfied with the SSD MobileNet V2 model's performance: too many false positives.
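Whichever model is used, one common way to cut down on false positives is to post-filter detections by confidence score before acting on them. A hypothetical sketch (the tuple layout and the 0.6 threshold are illustrative assumptions, not this server's API):

```python
# Illustrative post-filter: drop detections below a confidence threshold.
# The (label, score, bbox) layout and 0.6 default are assumptions.
def filter_detections(detections, min_score=0.6):
    """Keep only detections whose confidence meets the threshold."""
    return [d for d in detections if d[1] >= min_score]

raw = [
    ("person", 0.91, (10, 20, 110, 220)),
    ("dog", 0.35, (50, 60, 90, 120)),  # likely a false positive
]
kept = filter_detections(raw)  # only the high-confidence 'person' remains
```

Raising the threshold trades missed detections for fewer false alarms, so the right value depends on the deployment; switching to a stronger model like EfficientDet-Lite3 attacks the same problem from the model side.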