xuanandsix / CLRNet-onnxruntime-and-tensorrt-demo

This is the ONNX Runtime and TensorRT inference code for CLRNet: Cross Layer Refinement Network for Lane Detection (CVPR 2022). Official code: https://github.com/hongyliu/CLRNet

onnx inference time better than tensorrt inference time #6

Open mamadouDembele opened 2 years ago

mamadouDembele commented 2 years ago

Thanks for your amazing work. It seems the inference time of the ONNX model is better than that of the TensorRT model. Is there anything wrong with my testing? I got 150 ms inference time for the ONNX model and 770 ms for the TensorRT model.

xuanandsix commented 2 years ago

It may be that TensorRT consumes extra time when it is run for the first time. The usual method is to run inference multiple times and take the average. For example, ten inference runs in my environment give the following,
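A minimal sketch of that measurement approach: run a few warm-up iterations first (so first-run initialization cost is excluded), then average the remaining runs. The `benchmark` helper and the `session`/`"input"` names in the commented usage line are assumptions for illustration, not part of this repo's code.

```python
import time

def benchmark(run, warmup=3, iters=10):
    """Return average latency in milliseconds of `run()`,
    discarding `warmup` initial calls so one-time startup
    cost (e.g. TensorRT engine initialization) is excluded."""
    for _ in range(warmup):
        run()  # warm-up runs, not timed
    start = time.perf_counter()
    for _ in range(iters):
        run()  # timed runs
    return (time.perf_counter() - start) / iters * 1000.0

# Hypothetical usage with an ONNX Runtime session (names are placeholders):
# avg_ms = benchmark(lambda: session.run(None, {"input": input_tensor}))
```

Comparing the two backends with the same warm-up and iteration counts makes the numbers comparable; a single cold run mostly measures startup cost.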

[screenshot: timing results over ten inference runs]