gotoorder opened this issue 4 years ago
Thanks for the request. This package contains native components, so it would have to be built into a wheel file. If you'd like to try doing this yourself, follow the instructions here. And if you're successful, please make a pull request so we can add the package to the public repository.
If anyone else wants this package too, let us know by clicking the thumbs-up button above.
Right now I cannot find any onnxruntime packages.
If you have an existing onnx model you want to use with Chaquopy, you can try one of the following packages (see the conversion sketch below):
- tflite-runtime
- tensorflow
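For example, here's a rough desktop-side conversion sketch, assuming the `onnx` and `onnx-tf` packages and a placeholder `model.onnx`. You'd run this on your development machine and only ship the resulting `.tflite` file with the app:

```python
# Desktop-side conversion: ONNX -> TensorFlow SavedModel -> TFLite.
# Assumes: pip install onnx onnx-tf tensorflow. "model.onnx" is a placeholder.
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

onnx_model = onnx.load("model.onnx")    # load the ONNX graph
tf_rep = prepare(onnx_model)            # convert to a TensorFlow representation
tf_rep.export_graph("saved_model")      # write a SavedModel directory

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")
tflite_model = converter.convert()      # serialize to a .tflite flatbuffer
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

On the Android side you'd then load `model.tflite` with the tflite-runtime package.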
Is support for onnx planned in the near future?
Not in the near future, sorry. But you can always try building it yourself as mentioned above.
I have been trying to build onnxruntime for days now, but I am getting nowhere. Their build script is apparently able to output wheel files, but I just can't get it to work. Is anybody here able to do that? I am not at all familiar with C and C++ build scripts, compilers, etc.
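For reference, the kind of invocation I've been trying looks like this (the SDK/NDK paths are placeholders, and I'm honestly not sure this flag combination produces a wheel that Chaquopy can actually use):

```sh
# Cross-compile onnxruntime for Android; SDK/NDK paths are placeholders.
# --build_wheel asks the script to emit a Python wheel, but whether the
# result is compatible with Chaquopy's Python build is an open question.
./build.sh --config Release \
    --android \
    --android_sdk_path "$ANDROID_HOME" \
    --android_ndk_path "$ANDROID_NDK_HOME" \
    --android_abi arm64-v8a \
    --android_api 24 \
    --build_wheel
```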
Hello, I am wondering if there have been any updates regarding the support for ONNX/ONNXRuntime in Chaquopy. Our current project requires the use of ONNX models, and we are unable to convert these models to TensorFlow due to compatibility issues with tree models in the onnx2tf package. Adding ONNX/ONNXRuntime support would greatly benefit our development process. Thanks!
Sorry, there's no update. But we do also support PyTorch (version 1.8.1) and TensorFlow Lite (version 2.5.0) – could you convert your model to one of those formats?
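For instance, if you can export your model to TorchScript, loading and running it from Chaquopy-side Python looks roughly like this (`model.pt` and the input shape are placeholders for your own exported model):

```python
# Minimal TorchScript inference sketch; "model.pt" and the input
# shape are placeholders, not part of any specific project.
import torch

model = torch.jit.load("model.pt")   # load a TorchScript archive
model.eval()                         # switch to inference mode

with torch.no_grad():
    dummy_input = torch.randn(1, 3, 224, 224)   # placeholder input
    output = model(dummy_input)
print(output.shape)
```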
Is support for onnx planned in the near future? Are the wheel files at this link compatible with Chaquopy? https://www.wheelodex.org/projects/onnxruntime/
Sorry, the status is still the same as in my previous comment.
Hi @mhsmith, thanks for your great work "Pythonizing" Android, which I really like TBH.
Btw, does pytorch or tflite inference with Chaquopy utilize the GPU, or is it CPU-only? And is the model performance similar to native tflite in Flutter/Kotlin (ignoring the postprocessing)?
Thanks in advance
We've made no attempt to enable GPU support for these Python packages, so they're probably CPU-only. This means they may have worse performance than the official Android tflite packages for Java/Kotlin, but how much worse will depend on your application.
I tried an official tflite image classification demo for Android a few years ago, and I think the GPU mode was about twice as fast as the CPU mode. But things could have changed a lot since then.
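If you want concrete numbers for your own model, a simple on-device timing loop is the easiest check. A rough sketch, assuming a `model.tflite` file and the tflite-runtime package (the path and thread count are placeholder choices, not recommendations):

```python
# Rough CPU inference benchmark with tflite-runtime.
import time
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite", num_threads=4)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed random data of the right shape/dtype just to measure latency.
dummy = np.random.random_sample(inp["shape"]).astype(inp["dtype"])

runs = 50
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()
    _ = interpreter.get_tensor(out["index"])
print(f"avg inference: {(time.perf_counter() - start) / runs * 1000:.1f} ms")
```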
I need to convert models from ONNX to TensorFlow, but I cannot find onnx packages. Please add them, thanks!