TrojanXu / yolov5-tensorrt

A tensorrt implementation of yolov5: https://github.com/ultralytics/yolov5
Apache License 2.0
190 stars · 46 forks

How to run inference #1

Closed makaveli10 closed 4 years ago

makaveli10 commented 4 years ago

So, I tried converting the ONNX model to TRT using trtexec, but that was not successful; I ran into some issues. Then I came across this work of yours. I was curious whether you had run any inference on the TRT engine generated through this repo. If yes, please share the results or code if possible. Thanks!

TrojanXu commented 4 years ago

Yes. Run the inference using main.py in this repo and you will get an exported ONNX file. (Use torch 1.4.0 + onnx 1.6.0; this is important, otherwise you might run into errors with trtexec.) You can then feed this ONNX file to trtexec. But the ONNX model doesn't have NMS for now.
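For reference, feeding the exported model to trtexec might look like the sketch below. The filenames are assumptions (main.py decides the actual export name), and `--explicitBatch` is needed on TensorRT 7 for ONNX models; treat this as a starting point, not the repo's exact workflow:

```shell
# Export the ONNX model first (assumed to be written by main.py; the
# output filename here is hypothetical).
python main.py

# Build a serialized TensorRT engine from the ONNX file.
# --explicitBatch is required for ONNX parsing on TensorRT 7.
trtexec --onnx=yolov5.onnx --saveEngine=yolov5.trt --explicitBatch
```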

makaveli10 commented 4 years ago

So, this repo is just for generating the ONNX model? I did not see any code regarding NMS in main.py; is inference running without NMS for now?

Also, do you plan on including NMS in the model itself, if that is possible?

TrojanXu commented 4 years ago

Yes, inference is without NMS for now; I plan to implement it as a separate engine. The ONNX model is generated and stored locally. The engine will be created in the script but not stored locally.
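Until NMS lands in the engine, the raw detections can be filtered on the host side. Below is a minimal pure-Python sketch of greedy NMS over `(x1, y1, x2, y2)` boxes; it is an illustration of the standard algorithm, not code from this repo, and the threshold value is an assumption:

```python
def iou(a, b):
    # Intersection-over-union of two (x1, y1, x2, y2) boxes.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_thresh=0.5):
    # Greedy NMS: repeatedly keep the highest-scoring remaining box
    # and drop every box that overlaps it above iou_thresh.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)
        keep.append(i)
        order = [j for j in order if iou(boxes[i], boxes[j]) < iou_thresh]
    return keep
```

Usage: run `nms(boxes, scores)` on the engine's decoded outputs per class; the returned indices select the surviving detections.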

makaveli10 commented 4 years ago

@TrojanXu Great, thanks! Looking forward to the NMS implementation.