XiaoMi / mobile-ai-bench

Benchmarking Neural Network Inference on Mobile Devices
Apache License 2.0

Support benchmarking TFLite's MobileNetV2 and SqueezeNet v1.1? #17

Closed ysh329 closed 5 years ago

ysh329 commented 5 years ago

Thanks in advance.


The paper is "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size".

lydoc commented 5 years ago

We will support TFLite's MobileNetV2 and some quantized models later. As for SqueezeNet, TFLite officially supports SqueezeNet v1.0, which differs from the SqueezeNet v1.1 supported in MACE/NCNN/SNPE, so we do not benchmark SqueezeNet on TFLite in this project.

ysh329 commented 5 years ago

@lydoc I want to ask: is the squeezenetv1.1 model in the bench an fp32 model? I want to benchmark fp32 models only. Thanks.

lydoc commented 5 years ago

@ysh329 Sorry, that was a slip just now. I meant that TensorFlow Lite hosts only a SqueezeNet v1.0 tflite model. If you want to benchmark SqueezeNet v1.1 with TFLite, you can follow these steps: https://github.com/XiaoMi/mobile-ai-bench#adding-a-model-to-run-on-existing-framework
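To follow those steps you first need a `.tflite` file for SqueezeNet v1.1. A minimal sketch of the conversion, assuming TensorFlow 2.x is installed (the tiny Keras network below is only a hypothetical stand-in for a real SqueezeNet v1.1 graph):

```python
# Sketch: converting a TensorFlow/Keras model to a .tflite flatbuffer,
# the kind of artifact the "adding a model" steps in the README expect.
import tensorflow as tf

# Stand-in network; replace with a real SqueezeNet v1.1 implementation.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1000, activation="softmax"),
])

# Convert the in-memory Keras model to a TFLite flatbuffer (bytes).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("squeezenet_v1_1.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting file is what you would register with the benchmark per the README link above.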

ysh329 commented 5 years ago

@lydoc Thanks a lot. Also, I saw many mobilenetv2 models in the link you gave. Which one is the general baseline MobileNetV2 that the bench uses? 😿

lydoc commented 5 years ago

@ysh329 I guess what you mean is the different versions of Mobilenet_V1. We use Mobilenet_V1_1.0_224.

ysh329 commented 5 years ago

Thanks. 🙇