chaquo / chaquopy

Chaquopy: the Python SDK for Android
https://chaquo.com/chaquopy/
MIT License

Support for onnx and onnx-tf #216

Open gotoorder opened 4 years ago

gotoorder commented 4 years ago

I need to convert models from ONNX to TensorFlow, but I cannot find any onnx packages. Please add them, thanks!

mhsmith commented 4 years ago

Thanks for the request. This package contains native components, so it would have to be built into a wheel file. If you'd like to try doing this yourself, follow the instructions here. And if you're successful, please make a pull request so we can add the package to the public repository.

If anyone else wants this package too, let us know by clicking the thumbs-up button above.

liuwenhua6666 commented 2 years ago

I cannot find any onnxruntime packages either.

mhsmith commented 2 years ago

If you have an existing onnx model you want to use with Chaquopy, you can try the following:

istvmunkacsi commented 2 years ago

Is support for onnx planned in the near future?

mhsmith commented 2 years ago

Not in the near future, sorry. But you can always try building it yourself as mentioned above.

BentiGorlich commented 1 year ago

I have been trying to build onnxruntime for days now, but I am getting nowhere. Their build script is apparently able to output wheel files, but I just can't get it to work. Is there anybody here who is able to do that? I am not at all familiar with C and C++ build scripts, compilers, etc...

bouchnam commented 4 months ago

Hello, I am wondering if there have been any updates regarding the support for ONNX/ONNXRuntime in Chaquopy. Our current project requires the use of ONNX models, and we are unable to convert these models to TensorFlow due to compatibility issues with tree models in the onnx2tf package. Adding ONNX/ONNXRuntime support would greatly benefit our development process. Thanks!

mhsmith commented 4 months ago

Sorry, there's no update. But we do also support PyTorch (version 1.8.1) and TensorFlow Lite (version 2.5.0) – could you convert your model to one of those formats?

s16exe commented 1 month ago

Is support for onnx planned in the near future? Are the wheel files at this link compatible with Chaquopy? https://www.wheelodex.org/projects/onnxruntime/

mhsmith commented 1 month ago

Sorry, the status is still the same as in my previous comment.

briliantnugraha commented 1 month ago

Hi @mhsmith, thanks for your great work on "Pythonizing" Android, which I really like, TBH.

Btw, could I ask: does PyTorch or TFLite inference with Chaquopy utilize the GPU, or is it CPU-only? And is the model performance similar to native tflite in Flutter/Kotlin (ignoring the postprocessing)?

Thanks in advance

mhsmith commented 1 month ago

We've made no attempt to enable GPU support for these Python packages, so they're probably CPU-only. This means they may have worse performance than the official Android tflite packages for Java/Kotlin, but how much worse will depend on your application.

I tried an official tflite image classification demo for Android a few years ago, and I think the GPU mode was about twice as fast as the CPU mode. But things could have changed a lot since then.