linghu8812 / tensorrt_inference


How to speed up inferencing? #66

Open leeyunhome opened 3 years ago

leeyunhome commented 3 years ago

Hello,

I know that factors such as the input image size and the batch size affect inference speed.
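The effect of batch size on per-batch latency can be illustrated with a small stand-alone benchmark. This is only a sketch: it uses plain NumPy and a hypothetical `run_model` function (a single dense layer) as a stand-in for a real TensorRT engine, just to show how one might measure the factors mentioned above.

```python
import time
import numpy as np

def run_model(batch, weights):
    # Stand-in for one forward pass: a single dense layer with ReLU.
    return np.maximum(batch @ weights, 0.0)

def measure(batch_size, input_size, repeats=5):
    """Return the average time in seconds for one forward pass."""
    rng = np.random.default_rng(0)
    weights = rng.standard_normal((input_size, 256)).astype(np.float32)
    batch = rng.standard_normal((batch_size, input_size)).astype(np.float32)
    run_model(batch, weights)  # warm-up run, excluded from timing
    start = time.perf_counter()
    for _ in range(repeats):
        run_model(batch, weights)
    return (time.perf_counter() - start) / repeats

if __name__ == "__main__":
    for bs in (1, 8, 32):
        print(f"batch={bs:3d}  latency={measure(bs, 1024) * 1e3:.3f} ms")
```

On real hardware the same kind of timing loop (with warm-up runs and averaging) is how one would compare batch sizes or input resolutions for an actual engine.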

Have you ever used the NVIDIA edge device called the Jetson Nano?

If you have run this project on a Jetson Nano, what methods would you recommend to speed up inference?

Thank you.