microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License
14.14k stars 2.86k forks

[Documentation] How to run this model on android mobile platform #20937

Open Vinaysukhesh98 opened 3 months ago

Vinaysukhesh98 commented 3 months ago

Describe the documentation issue

The onnx package couldn't be set up in Termux. Is there any way to deploy a model and run it on Android?

Page / URL

https://onnxruntime.ai/docs/genai/tutorials/phi3-v.html

skottmckay commented 3 months ago

The package works on Android. We test by running on the official Android emulator from the Android SDK as well as on actual devices.

Termux is not officially supported by ONNX Runtime. Given that it's an Android terminal emulator for running a Linux shell, I'm not sure there's a production use case that would justify the work to try and support it.

You're welcome to try building the ORT library directly. If you're trying to run in a Linux shell, you'd want to build with the Linux instructions rather than the Android ones: https://onnxruntime.ai/docs/build/inferencing.html#cpu
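For reference, a minimal sketch of what a from-source CPU build on Linux looks like per the build docs linked above (the flags shown are the common ones from that page; whether the toolchain prerequisites are all installable inside Termux is untested, per the comment above):

```shell
# Get the source; --recursive pulls the submodules the build needs.
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime

# CPU-only Release build of the shared library; --parallel uses all cores.
# Requires cmake, python3, and a C++ compiler from your distro/package manager.
./build.sh --config Release --build_shared_lib --parallel
```

The resulting library and headers land under `build/Linux/Release`, which you can then link against from your own application.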

github-actions[bot] commented 2 months ago

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.