Barshan-Mandal opened 3 months ago
it might be pretty hard... #1083
but there must be a way, as we know that C# on its own isn't capable of doing many of the things torch can. Please make an easy and seamless integration; you could use PyTorch Mobile.
You're not supposed to put libtorch on user devices; it's too massive. You pretty much have two options:
- Export your models as ONNX and use that for inference.
- Build an ASP.NET Core server application hosting the TorchSharp code and expose it to the Android/iOS frontend (a minimal sketch follows below).
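To make the second option more concrete, here is a minimal sketch of an ASP.NET Core minimal-API service wrapping a TorchSharp model. The endpoint path, the `model.pt` file (a TorchScript module exported beforehand), and the input shape are assumptions for illustration, not anything this thread prescribes.

```csharp
// Minimal sketch (option 2): an ASP.NET Core web API wrapping a TorchSharp model.
// Assumes a project created from the ASP.NET Core web template with the
// TorchSharp-cpu package referenced, and a TorchScript file "model.pt"
// exported beforehand. Endpoint path, file name and input size are placeholders.
using TorchSharp;
using static TorchSharp.torch;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Load the scripted model once at startup; forward() maps Tensor -> Tensor.
var model = torch.jit.load<Tensor, Tensor>("model.pt");
model.eval();

// The mobile frontend posts a JSON float array and gets the scores back.
app.MapPost("/predict", (float[] features) =>
{
    using var scope = torch.NewDisposeScope();          // free native tensors when done
    var input = torch.tensor(features).reshape(1, -1);  // shape [1, featureCount]
    var output = model.forward(input);
    return Results.Ok(output.data<float>().ToArray());  // copy to managed memory
});

app.Run();
```

With this setup the mobile app only needs an HTTP client; none of the libtorch native binaries ship to the device.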
But is ONNX even available for embedded devices?
Something does exist. For instance, Unity has already built two ONNX inference engines that run on mobile. The more recent one is called Sentis; maybe you can use that? https://unity.com/products/sentis
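To illustrate the first option from the C# side: the ONNX Runtime NuGet package (Microsoft.ML.OnnxRuntime) also ships builds that run on Android and iOS, so an exported model can be consumed without leaving .NET. Below is a minimal sketch, assuming a model exported elsewhere (e.g. with PyTorch's torch.onnx.export) whose input is named "input" with shape [1, 3, 224, 224]; the file name, input name, and shape are placeholders.

```csharp
// Minimal sketch (option 1): run an exported ONNX model from C# with the
// Microsoft.ML.OnnxRuntime NuGet package. File name, input name and shape
// are placeholders for whatever your export actually produced.
using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

using var session = new InferenceSession("model.onnx");

// Dummy input tensor; replace with real preprocessed data.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new[] { NamedOnnxValue.CreateFromTensor("input", input) };

// Run inference and read the first output back as a flat float array.
using var results = session.Run(inputs);
float[] scores = results.First().AsEnumerable<float>().ToArray();

Console.WriteLine($"Got {scores.Length} output values.");
```

The same call pattern can be tested on desktop first and then, in principle, reused inside a MAUI/Xamarin app on the device.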
TorchSharp supports the platforms that libtorch supports: the CPU backend on Windows x64, macOS M1/M2/M3, and Linux x64. In addition, both Windows and Linux support the libtorch CUDA backends.
I want to use TorchSharp for Android and iOS. How can I do it?