VicHofs opened this issue 2 weeks ago
Hi @VicHofs,
Apologies for the late response. For mobile devices, it is not recommended to use very large models because they can consume a significant amount of memory and potentially cause OutOfMemoryError issues. Memory management on Android and iOS differs; you might find this article on Android vs. iOS memory management insightful.
Instead of loading a large model directly in the browser, you can convert your model to TFLite format using quantization. This can significantly reduce the model size and RAM usage. You can then use the @tensorflow/tfjs-tflite library to load and run the TFLite model in the browser.
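For reference, loading a converted TFLite model with @tensorflow/tfjs-tflite looks roughly like the sketch below. The model URL and input shape are placeholders; adjust them to your model.

```js
import * as tf from '@tensorflow/tfjs';
import * as tflite from '@tensorflow/tfjs-tflite';

async function runTFLiteModel() {
  // Load the quantized TFLite model (placeholder URL).
  const model = await tflite.loadTFLiteModel('https://example.com/model.tflite');

  // Run inference on a dummy input; replace the shape with your model's input shape.
  const input = tf.zeros([1, 224, 224, 3]);
  const output = model.predict(input);

  // Assumes a single-output model; `output` would be an array or map otherwise.
  output.print();

  // Free tensor memory when done.
  tf.dispose([input, output]);
}
```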
Also, you mentioned that you've successfully loaded a smaller model (~7 MB) on both devices, but it compromises accuracy. Could you please share more details about how you are compressing your model?
Please let me know if there's anything I have missed.
Thank You!!
Hi @shmishra99,
Thanks for the response!
Can I load TFLite models for inference in React Native? I'm not sure whether I ruled this option out earlier because it wasn't viable, but I know I've considered it in the past.
The smaller model is just another model I trained with fewer parameters and layers for the same purpose, so it is expected to have lower accuracy.
Yes, you can load a TFLite model in your React Native application. There are many examples available; you can check out the TFLite React Native examples here.
It's generally better to use a quantized model than one trained with fewer parameters. Quantization lowers the precision of the weights (for example, float32 to float16 or int8), which shrinks the model considerably while usually costing only a small amount of accuracy, so a quantized model will often perform better on mobile devices than a smaller architecture.
Thank You!!
This issue has been marked stale because it has had no recent activity in the last 7 days. It will be closed if no further activity occurs. Thank you.
System information
@tensorflow/tfjs 4.20.0
@tensorflow/tfjs-react-native 3.18.0
Issue
I am trying to import a custom graph model around 50 MB in size with bundleResourceIO (@tensorflow/tfjs-react-native) and loadGraphModel (@tensorflow/tfjs) for inference. Building the app with expo run:android succeeds and installs it on the simulator, but the app crashes immediately at runtime with the error shown in the logcat output below.

The model loads on the iOS build and causes no other issues. I can load a smaller model (~7 MB) with no problem on both platforms, but accuracy suffers significantly.
Minimal example
Here is a repo that replicates this issue. I've omitted my model, so some setup is required, as described in the README.md file.

Here is the logcat output from running the repo above: