Closed. KadirTilki closed this issue 2 years ago.
Does it take the same amount of time in native TensorFlow Lite? If so, please open an issue on TensorFlow's GitHub.
As for what could be changed on the Unity side: some Android devices are very slow to load files from SD cards, so loading the model asynchronously might improve it.
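For example, a minimal sketch of loading the model file asynchronously from StreamingAssets (the file name is a placeholder, and the Interpreter(byte[], InterpreterOptions) constructor is assumed from this repo's wrapper). Note that only the file read is non-blocking here; the interpreter creation itself still runs synchronously:

using System.Collections;
using TensorFlowLite;
using UnityEngine;
using UnityEngine.Networking;

public class AsyncModelLoader : MonoBehaviour
{
    // Placeholder file name; put the .tflite file under Assets/StreamingAssets.
    [SerializeField] string fileName = "custom_model.tflite";
    Interpreter interpreter;

    IEnumerator Start()
    {
        // On Android, StreamingAssets lives inside the APK, so it must be read via UnityWebRequest.
        string uri = System.IO.Path.Combine(Application.streamingAssetsPath, fileName);
        using (var request = UnityWebRequest.Get(uri))
        {
            yield return request.SendWebRequest();
            byte[] modelData = request.downloadHandler.data;
            // The file read above is asynchronous, but this native create call still blocks.
            interpreter = new Interpreter(modelData, new InterpreterOptions());
        }
        interpreter.AllocateTensors();
    }

    void OnDestroy() => interpreter?.Dispose();
}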
Hello, thank you for the quick response. I tested a few custom .tflite models in the native TensorFlow Lite example project, and model loading took 25 ms at most.
Regarding your other point, the devices I tested with do not have an SD card. I have also tried loading the models from different folders (Resources, StreamingAssets, Application.persistentDataPath); there was no visible difference between them, and all took around 900 ms.
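A minimal sketch of how the file read and the interpreter creation can be timed separately (the file name is a placeholder, the model is read from Application.persistentDataPath, and the Interpreter(byte[], InterpreterOptions) constructor is assumed from this repo):

using System.Diagnostics;
using System.IO;
using TensorFlowLite;
using UnityEngine;

public class ModelLoadTiming : MonoBehaviour
{
    // Placeholder file name; assumes the model was copied to persistentDataPath beforehand.
    [SerializeField] string fileName = "custom_model.tflite";

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, fileName);

        var sw = Stopwatch.StartNew();
        byte[] modelData = File.ReadAllBytes(path);
        sw.Stop();
        UnityEngine.Debug.Log($"File read: {sw.ElapsedMilliseconds} ms");

        var options = new InterpreterOptions();
        sw.Restart();
        // This constructor wraps the native TfLiteInterpreterCreate call.
        var interpreter = new Interpreter(modelData, options);
        sw.Stop();
        UnityEngine.Debug.Log($"Interpreter creation: {sw.ElapsedMilliseconds} ms");

        interpreter.Dispose();
    }
}

Separating the two stages makes it clear whether the time goes to file I/O or to the native create call.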
@KadirTilki Okay, it looks like an issue with this repo. Can you share your custom model here for testing?
I could reproduce it with this model from TensorFlow Hub: https://tfhub.dev/tensorflow/lite-model/mobilenet_v1_1.0_224/1/metadata/1?lite-format=tflite
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Environment:
Describe the bug
Loading custom TFLite models via BaseImagePredictor is very slow on the Android platform compared to other platforms. I have timed each operation and found that the
interpreter = TfLiteInterpreterCreate(model, options.nativePtr);
call inside the Interpreter class is the cause. On Android devices (Pixel 4a, Samsung Galaxy Tab S4 & S6) that operation takes around 900 ms to complete, while on a 2nd-generation iPad Pro it only takes around 32 ms.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
Creating an image predictor with a custom TFLite model on the Android platform should not be significantly slower than on other platforms.
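A possible mitigation until the underlying slowness is addressed is sketched below: create the interpreter on a background thread so the main thread does not stall for the ~900 ms. Assumptions: the model is stored as a TextAsset (e.g. custom_model.bytes under a Resources folder), the repo's Interpreter(byte[], InterpreterOptions) constructor is used, and nothing touches the interpreter until creation completes.

using System.Threading.Tasks;
using TensorFlowLite;
using UnityEngine;

public class BackgroundInterpreterCreation : MonoBehaviour
{
    Interpreter interpreter;

    async void Start()
    {
        // Placeholder asset name; Resources.Load must run on the main thread.
        byte[] modelData = Resources.Load<TextAsset>("custom_model").bytes;
        var options = new InterpreterOptions();

        // The native TfLiteInterpreterCreate call does not touch Unity APIs,
        // so constructing the wrapper on a thread-pool thread avoids blocking the main thread.
        interpreter = await Task.Run(() => new Interpreter(modelData, options));
        interpreter.AllocateTensors();
    }

    void OnDestroy() => interpreter?.Dispose();
}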