Open · vyang2968 opened 2 years ago
Hello @vyang2968 can you please share what type of model network you are trying to train and the model input size. Thanks!
@hjonnala, so far I have worked with EfficientDet-Lite and SSDLite. I have also trained SSD MobileNet V2 FPNLite 320x320 with the TensorFlow 2 Object Detection API and attempted to quantize it, but that has failed so far. Ideally, I would like to use the TF2 OD API and get a quantized model that I can run on the Edge TPU, but that support does not look like it is coming anytime soon. For now, I am working with SSD MobileNet V1, which looks promising, although it is a bit confusing.
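For reference, the usual route for getting a TF2 OD API export onto the Edge TPU is post-training full-integer quantization with `tf.lite.TFLiteConverter`, then compiling the result with `edgetpu_compiler`. The sketch below assumes a SavedModel exported at `exported_model/saved_model` and a 320x320 input (matching SSD MobileNet V2 FPNLite 320x320); both paths and the random calibration data are placeholders, not a verified working recipe for this model.

```python
# Hedged sketch: post-training full-integer quantization of a TF2 OD API
# export for Edge TPU compilation. Paths and the 320x320 input size are
# assumptions; swap in real training images for proper calibration.
import numpy as np

def representative_dataset(num_samples=100, size=320):
    """Yield calibration batches for the converter. Random data is only a
    placeholder -- real images from the training set give far better
    quantization ranges."""
    for _ in range(num_samples):
        yield [np.random.uniform(0, 255, (1, size, size, 3)).astype(np.float32)]

def convert(saved_model_dir="exported_model/saved_model",
            out_path="model_quant.tflite"):
    import tensorflow as tf  # deferred so the helper above imports without TF
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    # Require full-integer ops; any op the Edge TPU compiler cannot map
    # will fall back to the CPU and cost latency.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.uint8
    converter.inference_output_type = tf.uint8
    with open(out_path, "wb") as f:
        f.write(converter.convert())

if __name__ == "__main__":
    convert()
    # Then: edgetpu_compiler model_quant.tflite
```

Whether `convert()` succeeds depends on the model's ops being quantizable; SSD postprocessing ops in particular often need the OD API's own `export_tflite_graph_tf2.py` export path first.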
Description
When will we be able to quantize a custom-trained TensorFlow model and make it as fast as the pretrained models supplied by Google Coral? Every model I have produced by following the resources Google Coral provides performs poorly: under 10 FPS and over 200 ms of latency. I have been struggling with this for a while.
### Issue Type
Feature Request

### Operating System
Mendel Linux

### Coral Device
Dev Board

### Other Devices
_No response_

### Programming Language
Python 3.7

### Relevant Log Output
_No response_