Open Vinaysukhesh98 opened 3 months ago
The package works on Android. We test by running on the official Android emulator from the Android SDK as well as on actual devices.
Termux is not officially supported by ONNX Runtime. Given that it's an Android terminal emulator for running a Linux shell, I'm not sure there's a production use case that would justify the work required to support it.
You're welcome to try building the ORT library directly. If you're trying to run in a Linux shell, you'd want to build with the Linux instructions, not the Android ones: https://onnxruntime.ai/docs/build/inferencing.html#cpu
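As a rough sketch of what that looks like (commands taken from the general ONNX Runtime build-from-source flow; exact flags and prerequisites may differ by version, and whether this actually succeeds inside Termux's environment is untested):

```shell
# Clone ONNX Runtime with submodules (pin a release tag if you need a specific version)
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime

# CPU-only Release build producing a shared library,
# per the Linux build instructions linked above
./build.sh --config Release --build_shared_lib --parallel
```

Note that the build has its own toolchain requirements (cmake, a recent compiler, Python), which may themselves be nontrivial to satisfy under Termux.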
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
Describe the documentation issue
The onnxruntime package couldn't be set up in Termux. Is there any way to deploy a model and run it on Android?
Page / URL
https://onnxruntime.ai/docs/genai/tutorials/phi3-v.html