triple-Mu / YOLOv8-TensorRT

YOLOv8 accelerated with TensorRT!
MIT License

Inference boxes not visible on frame #259

Open Mayuresh999 opened 6 days ago

Mayuresh999 commented 6 days ago

Hey @triple-Mu, can you guide me through running YOLOv8 inference with TensorRT C++ on Jetson? I have a Jetson Orin NX 16 GB with the following config: JetPack 5.1.4, CUDA 11.4.315, cuDNN 8.6.0.166, TensorRT 8.5.2.2, OpenCV 4.6.0 with CUDA.

I have tried the method in the repo for Jetson using both Python and C++. The C++ method runs without any error in the log: I can see the frame and the GPU is being utilized, but the frame has no bboxes plotted. The model I am using is a plain pretrained yolov8s, which I converted to ONNX and then to a TensorRT engine on the same device.
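A frame with no boxes but no errors often points at the postprocessing step rather than the engine itself. As an illustration only (this is not the repo's actual decode code), here is a minimal sketch of how a confidence filter can silently drop every detection when the engine's scores are on a different scale than the threshold expects:

```python
# Illustrative sketch, not the repo's code: if the engine emits scores on a
# different scale than the threshold assumes (e.g. raw values that were never
# normalized to [0, 1] probabilities), the filter drops every detection and
# the frame renders with no boxes and no error message.
CONF_THRESHOLD = 0.25

def filter_detections(detections, threshold=CONF_THRESHOLD):
    """Keep only detections whose score clears the threshold.

    Each detection is a tuple (x1, y1, x2, y2, score, class_id).
    """
    return [d for d in detections if d[4] >= threshold]

# Scores as probabilities in [0, 1]: the box survives the filter.
ok = filter_detections([(10, 10, 50, 50, 0.90, 0)])

# Mis-scaled scores sitting below the threshold: everything is
# dropped silently, which looks exactly like "no bboxes plotted".
bad = filter_detections([(10, 10, 50, 50, 0.004, 0)])

print(len(ok), len(bad))  # 1 0
```

Printing the raw score tensor for one frame would quickly confirm or rule this out.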

triple-Mu commented 2 hours ago

> hey @triple-Mu can you guide me with running yolov8 inference with tensorrt cpp on jetson?. I have a Jetson Orin NX 16 GB with following config: Jetpack - 5.1.4 CUDA - 11.4.315 cuDNN - 8.6.0.166 TensorRT - 8.5.2.2 OpenCV - 4.6.0 with CUDA
>
> I have tried the method in the repo for jetson both using python and c++. The c++ method works without any error log, i can see the frame as well and GPU is also getting utilized but, the frame has no bboxes plotted. the model I am using is a simple pretrained yolov8s which I have converted to ONNX and then to tensorrt engine on the same device.

Can you share more detailed logs, such as the build command, the conversion log, the ONNX or PyTorch model, and a test image?
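For reference, the build/conversion step being asked about typically looks something like the following on a Jetson. The file names and flags here are illustrative assumptions, not the reporter's actual command:

```shell
# Hypothetical example of the requested "build command" -- paths and
# flags are placeholders, not taken from the reporter's setup.
# Build a TensorRT engine from the exported ONNX model on the Jetson
# itself, so the engine matches the device's TensorRT 8.5.2 runtime:
trtexec --onnx=yolov8s.onnx \
        --saveEngine=yolov8s.engine \
        --fp16
```

Posting the exact command actually used, plus the full `trtexec` (or build-script) log, would let the maintainer spot a mismatch between how the engine was built and what the C++ demo expects.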