vgoklani opened this issue 2 years ago
@vgoklani, great questions!

The runtime uses TorchScripted models to perform inference. The lite interpreter runtime has limited support for GPU (Vulkan/Metal) and NPU (CoreML/NNAPI). TorchScripted models for GPU/NPU are available, but the selection is currently very limited (e.g., MobileNet V2).

With react-native-pytorch-core, it's possible to debug and profile the app. Basically, when the app is built in debug mode, any profiler/instrument used for Android/iOS debugging/profiling can be used. Note: if you use the "pre-built PlayTorch app," then debugging and profiling are not supported.

jsi.h is a good starting point.

Hope this helps!
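As a rough sketch of the export path described above: models are TorchScripted and then saved for the lite interpreter before being bundled into the mobile app. The tiny model and filename below are illustrative, not part of the project; `optimize_for_mobile` defaults to the CPU backend, which matches the limited GPU/NPU support mentioned.

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# Stand-in model for illustration; a real app would use e.g. MobileNet V2.
class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)

model = TinyModel().eval()
scripted = torch.jit.script(model)          # TorchScript the model
optimized = optimize_for_mobile(scripted)   # mobile optimizations (CPU backend by default)
optimized._save_for_lite_interpreter("tiny_model.ptl")  # lite-interpreter format
```

The resulting `.ptl` file is what the lite interpreter runtime on the device loads for inference.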
Hello,
Thanks for creating this project! I had a few questions about mobile performance, and more specifically, mobile hardware utilization.
Thanks!
cc: @raedle