triple-Mu / YOLOv8-TensorRT

YOLOv8 accelerated with TensorRT!
MIT License

All zero outputs only when running on Jetson Xavier NX #88

Closed zhifeis closed 1 year ago

zhifeis commented 1 year ago

I first tried to convert and run my YOLOv8 model on Colab following the steps in this repository, and it worked perfectly. However, when I replicated the same steps on my Jetson Xavier NX, everything ran without error but the output was entirely zeros (i.e. no detections), even though the same image produced high-confidence, accurate detections on Colab and with the base .pt model. Does anyone know what's going on here?
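For anyone else hitting this, a quick way to confirm the symptom is to inspect the engine's raw output tensors. The sketch below assumes the end2end output layout this repo's engines produce (num_dets, bboxes, scores, labels); the helper name `looks_empty` is my own, not part of the repo.

```python
import numpy as np

def looks_empty(num_dets, boxes, scores, labels):
    """Return True if the engine produced the all-zero output described above.

    Assumed end2end layout: num_dets (1, 1) int32, boxes (1, N, 4) float32,
    scores (1, N) float32, labels (1, N) int32.
    """
    return (int(num_dets.reshape(-1)[0]) == 0
            and not np.any(boxes)
            and not np.any(scores))

# The symptom from this issue: every output tensor comes back zero.
num_dets = np.zeros((1, 1), dtype=np.int32)
boxes = np.zeros((1, 100, 4), dtype=np.float32)
scores = np.zeros((1, 100), dtype=np.float32)
labels = np.zeros((1, 100), dtype=np.int32)
print(looks_empty(num_dets, boxes, scores, labels))  # True on the broken Jetson run
```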

triple-Mu commented 1 year ago


What's your TensorRT version? I suggest you convert the .pt to ONNX on Colab, copy the ONNX file to your Jetson, then build the engine there with the trtexec tool.
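For anyone following along, the suggested workflow looks roughly like this. The key point is that a serialized TensorRT engine is tied to the TensorRT version and GPU it was built on, while the ONNX file is portable, so the engine must be built on the Jetson itself. Model names below are placeholders; the ultralytics CLI exporter is shown, but the repo's own export script works too. This is a device-specific sketch, not something runnable outside a Jetson/Colab setup.

```shell
# --- On both Colab and the Jetson: check the TensorRT version ---
python3 -c "import tensorrt; print(tensorrt.__version__)"

# --- On Colab: export the .pt checkpoint to ONNX ---
yolo export model=yolov8s.pt format=onnx

# --- Copy yolov8s.onnx to the Jetson, then build the engine on-device ---
# (trtexec ships with TensorRT on JetPack)
/usr/src/tensorrt/bin/trtexec \
    --onnx=yolov8s.onnx \
    --saveEngine=yolov8s.engine \
    --fp16   # optional; drop this flag if FP16 accuracy looks wrong
```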

zhifeis commented 1 year ago

Yeah, converting to ONNX on Colab worked for me. Any idea why that is?

triple-Mu commented 1 year ago

I suggest using JetPack 4.6.3. I just tested it on my TX2.
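To check which JetPack release a Jetson is actually running, the commands below work on a standard JetPack install (JetPack 4.6.3 corresponds to L4T R32.7.3):

```shell
# L4T release string, e.g. "# R32 (release), REVISION: 7.3, ..."
cat /etc/nv_tegra_release

# JetPack meta-package version, if the nvidia-jetpack package is installed
apt-cache show nvidia-jetpack | grep -m1 Version
```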