yonatanbitton opened 3 years ago
Is there any mobile implementation? Android / iOS? Thanks

Hi @yonatanbitton, I think you can convert the model to TFLite or Core ML format for deployment on Android and iOS (for example: torch -> onnx -> tensorflow -> tensorflow lite), then reuse the preprocessing and postprocessing code for inference. However, I don't highly recommend it: the two-stage model requires a detector and is not fast enough for real-time use on mobile. That said, I haven't tried it myself, so feel free to experiment and let me know about the timings. Thanks!