Hi!
I trained a detector model and converted it to TFLite.
Then I measured the model's performance on an Android phone using the benchmark tool from the official TensorFlow site (https://www.tensorflow.org/lite/performance/measurement#download_or_build_the_binary) and got an average inference time of about 20 ms.
But when I plugged my model into your object detection example using your Flutter library v0.9.0 and ran the predict function, inference took about 60 ms (3 times slower).
Do you have any idea why there is such a big difference?
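For context, here is a minimal sketch of how the raw interpreter call can be timed with tflite_flutter; the model file name, input/output shapes, and thread count are placeholders for illustration (a real detector usually has several output tensors), but it shows the kind of measurement I am comparing against the benchmark binary:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

Future<void> timeInference() async {
  // Match the CPU thread count used by the official benchmark binary
  // (its num_threads flag); a threading mismatch is a common cause of
  // a 2-3x gap between the two measurements.
  final options = InterpreterOptions()..threads = 4;
  final interpreter =
      await Interpreter.fromAsset('detector.tflite', options: options);

  // Hypothetical 1x320x320x3 float32 input; use your model's real shape.
  final input = [
    List.generate(
        320, (_) => List.generate(320, (_) => List.filled(3, 0.0)))
  ];
  // Single illustrative output buffer; adapt to your model's outputs.
  final output = [List.filled(10, 0.0)];

  // Warm up once, then time only interpreter.run so that image
  // preprocessing and buffer allocation are excluded from the average.
  interpreter.run(input, output);
  final sw = Stopwatch()..start();
  for (var i = 0; i < 50; i++) {
    interpreter.run(input, output);
  }
  sw.stop();
  print('avg inference: ${sw.elapsedMilliseconds / 50} ms');
}
```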