Open knn1989 opened 1 week ago
This is one area the team is exploring. @JacobSzwejbka could comment more
It would be greatly appreciated if this feature could be supported. I have attempted numerous times to transition from using TensorFlow Lite to PyTorch for edge device deployment, but the absence of features like this has hindered my efforts. Thank you.
I'm quite curious: training, unlike inference, usually requires much more compute, so how is on-device training helpful? Is it because it shortens the development iteration cycle for training smaller models?
It's under active development, and we should have something to share soon. From your side @knn1989, can you tell me a little bit about what your use case looks like?
Does ExecuTorch support on-device training (online learning / model updates on edge devices)? If so, how can it be enabled? Thanks.
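For context, "on-device training" here means running gradient updates on the target device itself, e.g. personalizing a small part of a model on freshly collected local data. Below is a minimal sketch of that kind of update loop in plain PyTorch; it is an illustration of the pattern, not ExecuTorch's training API (which this thread says is still under development), and the model, shapes, and data are made up for the example.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical setup: a frozen pretrained backbone plus a small
# trainable head that gets personalized on-device.
backbone = nn.Linear(8, 16)   # pretend this was pretrained off-device
head = nn.Linear(16, 2)       # only this part is updated on-device

for p in backbone.parameters():
    p.requires_grad_(False)   # freeze the backbone to keep updates cheap

opt = torch.optim.SGD(head.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# A handful of fresh samples "collected on the device" (synthetic here).
x = torch.randn(4, 8)
y = torch.tensor([0, 1, 0, 1])

def step_loss() -> float:
    """Run one forward pass and return the current loss value."""
    return loss_fn(head(torch.relu(backbone(x))), y).item()

initial_loss = step_loss()
for _ in range(20):           # a short personalization loop
    opt.zero_grad()
    loss = loss_fn(head(torch.relu(backbone(x))), y)
    loss.backward()
    opt.step()
final_loss = step_loss()

print(initial_loss, final_loss)
```

Only the head's few parameters are updated, which is the usual way to keep the compute and memory cost of training small enough for an edge device.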