ysh329 closed this issue 6 years ago
We will support TFLite's MobileNet v2 and some quantized models later. As for SqueezeNet, TFLite officially supports SqueezeNet v1.0, which differs from the SqueezeNet v1.1 supported in MACE/NCNN/SNPE, so this project does not support SqueezeNet in TFLite.
@lydoc I want to ask: is the SqueezeNet v1.1 model in the benchmark an fp32 model? I want to benchmark fp32 models only. Thanks.
@ysh329 Sorry, that was a slip just now. I meant that TensorFlow Lite officially hosts only the SqueezeNet v1.0 tflite model. If you want to benchmark SqueezeNet v1.1 in TFLite, you can follow these steps: https://github.com/XiaoMi/mobile-ai-bench#adding-a-model-to-run-on-existing-framework
@lydoc Many thanks. Besides, I saw many MobileNet v2 models in the link you gave. Which MobileNet v2 is the general baseline that the benchmark used? 😿
@ysh329 I guess what you mean is the different versions of Mobilenet_V1. We use Mobilenet_V1_1.0_224.
Thanks. 🙇
Thanks in advance.
The paper is titled "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size".