microsoft / onnxruntime-training-examples

Examples for using ONNX Runtime for model training.
MIT License

[Training] ONNX Runtime Training: Can on-device learning be processed only in C/C++ or Python? #197

Closed Leo5050xvjf closed 1 month ago

Leo5050xvjf commented 1 month ago

Sorry, I am new to this area. I would like to ask whether ORT Training can be used for training and inference purely from Python or C/C++ in an arm64, Debian 12 environment?

I successfully ran model training and inference with a demo example (https://github.com/microsoft/onnxruntime-training-examples/blob/master/on_device_training/mobile/android/c-cpp/train.ipynb) on my PC (Ubuntu 20.04, x86). Now I want to use the CPU to perform training and inference on my development board (arm64, Debian 12), but I am unable to install the package with pip install onnxruntime-training. While investigating, I found that newer versions of onnxruntime-training are no longer published on PyPI, so I could only download onnxruntime-training-linux-aarch64-1.19.2.tgz from https://github.com/Microsoft/onnxruntime/releases/tag/v1.19.2.

Does this mean that I can only use C/C++ as the API to complete on-device training?

Looking forward to your reply! Thank you!
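
For readers in the same situation: the release archive linked above does make on-device training possible from C/C++ alone. A minimal sketch of a training loop using the ORT training C++ header (`onnxruntime_training_cxx_api.h`) might look like the following; the artifact file names, batch construction, and epoch count are placeholders, assuming the training artifacts were generated offline as in the linked notebook.

```cpp
// Sketch of an on-device training loop with the ONNX Runtime training C++ API.
// Assumes the training artifacts (training model, eval model, optimizer model,
// checkpoint) were generated offline, e.g. via the artifact-generation step in
// the linked notebook. Paths and the batch contents below are placeholders.
#include <onnxruntime_training_cxx_api.h>
#include <vector>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "on_device_training");
  Ort::SessionOptions session_options;

  // Load the checkpoint produced during artifact generation.
  Ort::CheckpointState state =
      Ort::CheckpointState::LoadCheckpoint("checkpoint");

  // Create the training session from the generated artifacts.
  Ort::TrainingSession session(env, session_options, state,
                               "training_model.onnx",
                               "eval_model.onnx",
                               "optimizer_model.onnx");

  for (int epoch = 0; epoch < 5; ++epoch) {
    // Fill `inputs` with this step's input and label tensors (omitted here).
    std::vector<Ort::Value> inputs;  // placeholder batch
    session.TrainStep(inputs);       // forward + backward pass
    session.OptimizerStep();         // apply the gradients
    session.LazyResetGrad();         // reset gradients for the next step
  }

  // Persist the updated weights.
  Ort::CheckpointState::SaveCheckpoint(state, "checkpoint_trained");
  return 0;
}
```

Linking against the `libonnxruntime` libraries from the aarch64 release archive, this is the same train-step / optimizer-step / reset-grad cycle the Python demo performs.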

baijumeswani commented 1 month ago

Hi @Leo5050xvjf. We do not publish onnxruntime-training wheels for Linux aarch64. You could consider building from source if you want to use Python. If you only need the C++ libraries, you can use the package you linked in the description.
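
For context, a native source build of the training wheel on the board would look roughly like the following; the flags come from the onnxruntime build script, but the exact set and output path may vary by version, so treat this as a sketch rather than a verified recipe.

```shell
# Sketch: build onnxruntime with training enabled and produce a Python wheel.
# Run natively on the aarch64 board; this needs git, cmake, and a recent
# compiler, and the build can take a long time on a small device.
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
./build.sh --config Release \
           --enable_training \
           --build_wheel \
           --parallel \
           --skip_tests
# The wheel is typically emitted under build/Linux/Release/dist/
pip install build/Linux/Release/dist/onnxruntime_training-*.whl
```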

Leo5050xvjf commented 1 month ago


Okay, I understand now. Thank you for the response!👍