Open Kangjik94 opened 2 years ago
If I have to convert the torch model to TVM (or TensorRT), could you give me some advice?
We have released the code for running our model on Jetson Nano with a pre-built TVM binary in nano_demo. To convert the torch model to a TVM binary, you may want to check the TVM Auto-Scheduler Tutorial.
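A minimal sketch of the conversion path, assuming `tvm` and `torch` are installed and that you have already loaded the LitePose network from a checkpoint (the input name, input shape, and output filename below are assumptions for illustration, not values from the repo):

```python
import torch
import tvm
from tvm import relay

model = ...  # load the LitePose model from its checkpoint and call .eval()

# Trace the model to TorchScript; the 1x3x256x256 shape is an assumed example.
dummy = torch.randn(1, 3, 256, 256)
scripted = torch.jit.trace(model, dummy)

# Import the TorchScript graph into Relay IR.
mod, params = relay.frontend.from_pytorch(scripted, [("input", dummy.shape)])

# Jetson Nano CPU target (64-bit ARM); use "cuda" instead to target its GPU.
target = tvm.target.Target("llvm -mtriple=aarch64-linux-gnu")
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# Export the compiled module; load it on the Nano with tvm.runtime.load_module.
lib.export_library("litepose.so")
```

Compiling with `relay.build` alone gives default schedules; the speedups reported in nano_demo come from tuning the kernels with the auto-scheduler first, which is what the linked tutorial covers.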
Hello, I tested your COCO and CrowdPose checkpoint (.pth.tar) files using litepose/valid.py,
but in my experiments, when using the COCO-trained LitePose-Auto-S, the inference speed was 2 FPS.
Is there a way to speed up inference on Jetson Nano?
Or... did I miss something? (like converting the torch model to TVM)
When I tested litepose/nano_demo/start.py with the provided weights, the FPS was almost 7.