jkjung-avt / tensorrt_demos

TensorRT MODNet, YOLOv4, YOLOv3, SSD, MTCNN, and GoogLeNet
https://jkjung-avt.github.io/
MIT License

Error opening engine file with ./trtexec #528

Closed lpkoh closed 2 years ago

lpkoh commented 2 years ago

Hi, I am trying to test the speed of the trt models.

I ran yolo_to_onnx.py and onnx_to_tensorrt.py and successfully ran eval_yolo.py and saw the fps and bounding boxes.

However, when I ran ./trtexec --loadEngine= (plus other settings) for the inference phase (reference: https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#trtexec-serialized-timing-cache) to obtain a different FPS estimate, I got "Error opening engine file".

[screenshot: trtexec error output]

Likewise, when I convert darknet models to ONNX using Tianxiaomo's repo and then convert those ONNX models to TRT engines using ./trtexec, the resulting engines are not usable in this repo.

[screenshot: error output]

Could you clarify this issue?

Also, a separate clarification: does onnx_to_tensorrt.py build the engine in FP16 mode by default?

jkjung-avt commented 2 years ago

@lpkoh The TensorRT YOLO engines built with this repo contain the "yolo_layer" plugin, so trtexec needs to be told where to find the plugin library. You'll have to call trtexec with something like the following.

$ ./trtexec --loadEngine=yolo/yolov4-tiny.trt --plugins=plugins/libyolo_layer.so
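The same requirement applies when deserializing the engine from Python: the plugin's shared library must be loaded into the process first so TensorRT can resolve the "yolo_layer" op. A minimal sketch of that pattern is below (the helper name `load_yolo_plugin` is hypothetical; the path assumes this repo's default layout, and the actual loading in this repo is done inside its Python demo scripts):

```python
import ctypes

def load_yolo_plugin(path="plugins/libyolo_layer.so"):
    """Load the yolo_layer plugin library into the process so that
    TensorRT can deserialize engines referencing the plugin.
    Returns True on success, False if the library cannot be loaded."""
    try:
        ctypes.CDLL(path)  # the plugin registers itself on load
        return True
    except OSError:
        return False
```

Call this (or an equivalent `ctypes.cdll.LoadLibrary` line) before `trt.Runtime(...).deserialize_cuda_engine(...)`; otherwise deserialization fails just like the bare trtexec invocation does.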