Kmarconi opened 4 years ago
Hi, same here. I am also looking into using the TensorRT Inference Server for this.
I know how to load and run an engine using the TensorRT SDK. Examples can be found once the TensorRT SDK is installed.
Yeah, loading the engine is not the problem; how to run inference and display its results is more what I'm looking for.
Hello, after converting to TRT, save the model. You can then load it again and run inference. You can refer to this file: https://github.com/NVIDIA-AI-IOT/torch2trt/blob/master/torch2trt/test.py
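To make that concrete, here is a minimal sketch of loading a saved torch2trt engine and running it on one image. The checkpoint path (`model_trt.pth`), the 224x224 input size, and the ImageNet normalization are assumptions for illustration; adjust them for your own model. The `TRTModule` load pattern is the one torch2trt documents.

```python
# Sketch: run inference with a torch2trt engine on a single image.
# Assumes the engine was saved earlier with:
#   torch.save(model_trt.state_dict(), 'model_trt.pth')
# and that the model expects a 224x224 ImageNet-normalized RGB input.

import numpy as np

# Standard torchvision ImageNet normalization constants.
MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(image_hwc):
    """Turn an HxWx3 uint8 RGB image into a 1x3xHxW float32 batch."""
    x = image_hwc.astype(np.float32) / 255.0   # scale pixels to [0, 1]
    x = (x - MEAN) / STD                       # ImageNet normalization
    x = x.transpose(2, 0, 1)[None]             # HWC -> CHW, add batch dim
    return x

def run_inference():
    # Requires a CUDA GPU with torch and torch2trt installed.
    import torch
    from torch2trt import TRTModule

    model_trt = TRTModule()
    model_trt.load_state_dict(torch.load('model_trt.pth'))  # example path

    image = np.zeros((224, 224, 3), dtype=np.uint8)  # stand-in for a real image
    x = torch.from_numpy(preprocess(image)).cuda()
    with torch.no_grad():
        y = model_trt(x)   # raw output tensor (logits, detections, ...)
    return y

# run_inference()  # uncomment on a machine with a GPU and torch2trt installed
```

The output tensor `y` is whatever your original PyTorch model returns (e.g. class logits or detection boxes), so the postprocessing/display step is the same as with the plain PyTorch model.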
Hi, I was wondering if anyone knew how I can use my torch2trt engine on images and get back the tensor with the detections in it. I know that I need to preprocess my image before sending it to my model, but what do I do after that?
Thanks