yuanze-lin opened this issue 6 years ago
I have tested it. The inference time of MobileNet v2 is about 19 ms, and MobileNet v1 is about 11 ms.
@yiran-THU Thank you for your response, but the paper states that the inference time of MobileNet v2 should be less than that of MobileNet v1. Are there any other versions of MobileNet v2?
The original implementation comes from here: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md The inference time of v2 was 27 ms and v1 was 31 ms, but those numbers are based on the TensorFlow framework.
In this project, the deploy model of MobileNet v1 was produced by merge_bn.py, which folds the batch-norm and scale layers into the convolution layers, so it runs faster: https://github.com/chuanqi305/MobileNet-SSD/blob/master/merge_bn.py
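For context, the folding trick works because a convolution followed by batch norm and scale is still an affine function of the input, so the three layers can be collapsed into one convolution with rescaled weights and a shifted bias. Below is a minimal sketch of that math for a single output channel; the function and variable names are illustrative (the actual merge_bn.py operates on Caffe prototxt/caffemodel files).

```python
import math

# Hypothetical sketch of folding BatchNorm + Scale into a conv channel.
# Names are illustrative; the real merge_bn.py edits Caffe model files.
def fold_bn_into_conv(weights, bias, mean, var, gamma, beta, eps=1e-5):
    """Fold BN running stats (mean, var) and Scale params (gamma, beta)
    into the kernel weights and bias of one convolution output channel."""
    s = gamma / math.sqrt(var + eps)
    folded_weights = [w * s for w in weights]
    folded_bias = (bias - mean) * s + beta
    return folded_weights, folded_bias

# Sanity check on a toy 1x1 "convolution" (a dot product): the folded
# layer must match conv -> BN -> scale applied as separate steps.
x = [0.5, -1.0, 2.0]
w, b = [0.2, -0.3, 0.7], 0.1
mean, var, gamma, beta = 0.4, 1.5, 1.2, -0.2

y = sum(wi * xi for wi, xi in zip(w, x)) + b
y_bn = (y - mean) / math.sqrt(var + 1e-5) * gamma + beta

wf, bf = fold_bn_into_conv(w, b, mean, var, gamma, beta)
y_folded = sum(wi * xi for wi, xi in zip(wf, x)) + bf
print(abs(y_bn - y_folded) < 1e-9)  # True
```

Since the folded network has fewer layers to execute at inference time, the speedup comes for free with no change in outputs.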
@eric612 Hi, so if you use merge_bn.py to produce the deploy model for MobileNet v2, could it become faster as well?
Hello, in this project, is MobileNet v2 actually faster? If so, what FPS does v2 reach?