Here are some suggestions that might be useful:
1. If your target device supports Android NNAPI and you can use SoC hardware acceleration, you can get higher speed by using TensorFlow Lite instead of TensorFlow Mobile. Quantizing the model to FP16 or INT8 loses some precision but gives a large speed boost (see the conversion sketch below).
2. Use a newer model. SSD MobileNet V2 is faster than SSD MobileNet V1, and SSD MNASNet is faster than SSD MobileNet V2, with comparable accuracy.
3. Use SSDLite instead of SSD.
4. Lowering the input resolution and depth_multiplier reduces accuracy, but also reduces latency.
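A minimal sketch of post-training quantization with the TensorFlow Lite converter, assuming the detection model has already been exported as a SavedModel. The `export_dir` path and output filename are illustrative, and the exact converter entry point depends on your TensorFlow version:

```python
import tensorflow as tf

# Assumed path to an exported SavedModel of the detector (illustrative).
export_dir = "exported_model/saved_model"

converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)

# Enable default optimizations and request FP16 weights; omitting the
# target_spec line instead gives dynamic-range (INT8 weight) quantization.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]

tflite_model = converter.convert()
with open("detector_fp16.tflite", "wb") as f:
    f.write(tflite_model)
```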
Can I quantize my model using Tensorflow Mobile instead of Tensorflow Lite?
Only the TensorFlow Lite runtime has support for running quantized operations. TensorFlow Mobile is slated for deprecation soon.
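For reference, here is a minimal sketch of running a quantized .tflite model with the TensorFlow Lite Python interpreter as a sanity check before deploying; on Android you would use the equivalent TensorFlow Lite Java/Kotlin Interpreter, optionally with the NNAPI delegate. The model path is illustrative and assumes the conversion sketch above:

```python
import numpy as np
import tensorflow as tf

# Illustrative path to the quantized model produced by the converter.
interpreter = tf.lite.Interpreter(model_path="detector_fp16.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy image with the shape and dtype the model expects.
dummy = np.random.random_sample(input_details[0]["shape"]).astype(
    input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

# For an SSD detector the outputs are typically boxes, classes, scores, count.
outputs = [interpreter.get_tensor(d["index"]) for d in output_details]
```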
Closing this issue since it's resolved. Feel free to reopen if you have any further questions. Thanks!
Will SSDLite be faster than a quantized SSD model?
Describe the problem
I am running a MobileNet SSD object detector, and while its detection performance is fine, its speed is on the low side. What can I do to increase the detection speed on Android?