eric612 / MobileNet-SSD-windows


Is the inference time of MobileNetv2 smaller than v1??? #4

Open yuanze-lin opened 6 years ago

yuanze-lin commented 6 years ago

Hello, in this project, is MobileNetv2 faster? If so, what is the FPS of v2?

yiran-THU commented 6 years ago

I have tested it. The inference time of MobileNetv2 is about 19 ms, and MobileNetv1 is about 11 ms.

yuanze-lin commented 6 years ago

@yiran-THU Thank you for your response, but the paper mentions that the inference time of MobileNet v2 should be less than that of MobileNet v1. Are there any other versions of MobileNet v2?

eric612 commented 6 years ago

The original implementation came from here: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md There, the inference time of v2 was 27 ms and v1 was 31 ms, but that is based on the TensorFlow framework.

In this project, the deploy model of mobilenet-v1 was made by merge_bn.py, which folds the batch norm and scale layers into the convolution layers, so it will be faster: https://github.com/chuanqi305/MobileNet-SSD/blob/master/merge_bn.py
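For reference, the folding that merge_bn.py performs can be sketched in a few lines of NumPy. This is a minimal illustration of the math, not the script itself: Caffe stores the statistics in a BatchNorm layer (mean/variance) and the learned affine in a Scale layer (gamma/beta), and both can be absorbed into the preceding convolution's weights and bias. The function and variable names below are hypothetical.

```python
import numpy as np

def fold_bn_into_conv(W, b, mean, var, gamma, beta, eps=1e-5):
    """Fold a BatchNorm (mean/var) + Scale (gamma/beta) pair into the
    preceding convolution. This is the transformation merge_bn.py applies
    when producing the deploy model (sketch, not the actual script).
    W: conv weights, shape (out_ch, in_ch, kh, kw); b: conv bias, (out_ch,).
    Returns (W', b') such that conv(x, W', b') == scale(bn(conv(x, W, b))).
    """
    s = gamma / np.sqrt(var + eps)        # per-output-channel multiplier
    W_folded = W * s[:, None, None, None]  # rescale each output filter
    b_folded = (b - mean) * s + beta       # shift the bias to match
    return W_folded, b_folded

# Sanity check on a 1x1 convolution at a single spatial position,
# which reduces to a matrix-vector product.
rng = np.random.default_rng(0)
out_ch, in_ch = 4, 3
W = rng.standard_normal((out_ch, in_ch, 1, 1))
b = rng.standard_normal(out_ch)
mean, var = rng.standard_normal(out_ch), rng.random(out_ch) + 0.5
gamma, beta = rng.standard_normal(out_ch), rng.standard_normal(out_ch)
x = rng.standard_normal(in_ch)

conv_out = W[:, :, 0, 0] @ x + b
bn_out = (conv_out - mean) / np.sqrt(var + 1e-5) * gamma + beta

Wf, bf = fold_bn_into_conv(W, b, mean, var, gamma, beta)
folded_out = Wf[:, :, 0, 0] @ x + bf
assert np.allclose(bn_out, folded_out)
```

At inference time the folded model computes one convolution instead of conv + BN + scale, which is where the speedup comes from; the outputs are numerically identical.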

yuanze-lin commented 6 years ago

@eric612 Hi, so if you use merge_bn.py to deploy the mobilenet-v2 model as well, could it become faster?