asus4 / tf-lite-unity-sample

TensorFlow Lite Samples on Unity

Loading custom TFLite models via BaseImagePredictor is slow on Android platform #171

Closed KadirTilki closed 2 years ago

KadirTilki commented 3 years ago

Environment:

Describe the bug Loading custom TFLite models via BaseImagePredictor is very slow on Android compared to other platforms. I timed each operation and found that the interpreter = TfLiteInterpreterCreate(model, options.nativePtr); call inside the Interpreter class is the cause. On Android devices (Pixel 4a, Samsung Galaxy Tab S4 & S6) that call takes around 900 ms to complete, while on an iPad Pro 2nd generation it only takes around 32 ms.
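For reference, the timing above can be reproduced by wrapping interpreter creation in a Stopwatch. A minimal sketch, assuming the Interpreter and InterpreterOptions types from this repo's TensorFlowLite namespace, and a hypothetical model file already copied to persistentDataPath:

```csharp
using System.IO;
using TensorFlowLite;
using UnityEngine;

public class InterpreterTiming : MonoBehaviour
{
    // Hypothetical file name; assumes the .tflite model was copied
    // to Application.persistentDataPath beforehand.
    [SerializeField] private string modelFile = "mobilenet_v1_1.0_224.tflite";

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, modelFile);
        byte[] modelData = File.ReadAllBytes(path);

        var sw = System.Diagnostics.Stopwatch.StartNew();
        // The Interpreter constructor wraps the native TfLiteInterpreterCreate
        // call, which is where the reported ~900 ms is spent on Android.
        var interpreter = new Interpreter(modelData, new InterpreterOptions());
        sw.Stop();
        Debug.Log($"Interpreter creation took {sw.ElapsedMilliseconds} ms");

        interpreter.Dispose();
    }
}
```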

To Reproduce Steps to reproduce the behavior:

  1. Create an image predictor with a custom TFLite model (e.g., https://tfhub.dev/tensorflow/lite-model/mobilenet_v1_1.0_224/1/default/1)
  2. Deploy on an Android and an iOS device.
  3. Measure the time to create the image predictor on both devices.
  4. Compare the measurements.

Expected behavior Creating an image predictor with a custom TFLite model on Android should not be significantly slower than on other platforms.

asus4 commented 3 years ago

Does it take the same amount of time in native TensorFlow Lite? If so, please open an issue on TensorFlow's GitHub repository.

As for changes on the Unity side: some Android devices are very slow to load files from SD cards, so loading the model asynchronously might improve things.
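A minimal sketch of that suggestion, assuming the Interpreter type from this repo and a hypothetical model path; moving both the file read and interpreter creation onto a thread-pool thread would at least keep the main thread responsive even if the native call itself stays slow:

```csharp
using System.IO;
using System.Threading.Tasks;
using TensorFlowLite;
using UnityEngine;

public class AsyncModelLoader : MonoBehaviour
{
    async void Start()
    {
        // Hypothetical path; adjust for where the model actually lives.
        string path = Path.Combine(Application.persistentDataPath, "model.tflite");

        // Read the file and create the interpreter off the main thread,
        // so a slow TfLiteInterpreterCreate does not block rendering.
        Interpreter interpreter = await Task.Run(() =>
        {
            byte[] modelData = File.ReadAllBytes(path);
            return new Interpreter(modelData, new InterpreterOptions());
        });

        Debug.Log("Interpreter ready");
        // ...allocate tensors and run inference on the main thread as usual.
    }
}
```

Note this hides the latency rather than removing it; if the native call itself is the bottleneck, the 900 ms would still be spent, just without freezing the UI.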

KadirTilki commented 3 years ago

Hello, thank you for the quick response. I tested a few custom .tflite models in the native TensorFlow Lite example project, and model loading took at most 25 ms.

Regarding your other point, the devices I tested with do not have an SD card. I also tried loading the models from different folders (Resources, StreamingAssets, Application.persistentDataPath); there was no noticeable difference between them, and all took around 900 ms.

asus4 commented 3 years ago

@KadirTilki Okay, it looks like an issue with this repo. Can you share your custom model here for testing?

KadirTilki commented 3 years ago

I could reproduce it with this model from TensorFlow Hub: https://tfhub.dev/tensorflow/lite-model/mobilenet_v1_1.0_224/1/metadata/1?lite-format=tflite

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.