FeiYull / TensorRT-Alpha

TensorRT for YOLOv8, YOLOv8-Pose, YOLOv8-Seg, YOLOv8-Cls, YOLOv7, YOLOv6, YOLOv5, YOLONAS...... CUDA IS ALL YOU NEED.

Trying to run YOLOv4 model on TensorRT-Alpha #7

Closed: ZyLi99 closed this issue 1 year ago

ZyLi99 commented 1 year ago

I am trying to use the YOLOv4 model provided in the TensorRT-Alpha repository, but I am encountering an error when running inference. Specifically, I get the following message: `CUDA error: invalid device function`. I have double-checked that my CUDA version is compatible with the code and that my GPU is properly configured. I would appreciate any help in resolving this issue.
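
For reference, `invalid device function` at kernel-launch time usually means the binary contains no device code (SASS or PTX) for the installed GPU's compute capability, i.e. the project was built with `-gencode` / `CMAKE_CUDA_ARCHITECTURES` settings that do not cover the card. The sketch below is not part of TensorRT-Alpha; it is a minimal standalone probe (the kernel name `probeKernel` and the `CHECK_CUDA` macro are made up for illustration, and device 0 is assumed) that prints the GPU's compute capability and the SM version a compiled kernel targets, so an architecture mismatch is easy to spot.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Trivial kernel used only to test whether device code compiled into this
// binary can actually run on the installed GPU.
__global__ void probeKernel() {}

// Simple error-reporting macro (hypothetical helper, not from the repository).
#define CHECK_CUDA(call)                                                   \
    do {                                                                   \
        cudaError_t err_ = (call);                                         \
        if (err_ != cudaSuccess) {                                         \
            std::printf("CUDA error at %s:%d: %s\n", __FILE__, __LINE__,   \
                        cudaGetErrorString(err_));                         \
        }                                                                  \
    } while (0)

int main() {
    // Report the compute capability of device 0 (assumed to be the GPU
    // used for inference).
    cudaDeviceProp prop{};
    CHECK_CUDA(cudaGetDeviceProperties(&prop, 0));
    std::printf("GPU 0: %s, compute capability %d.%d\n",
                prop.name, prop.major, prop.minor);

    // binaryVersion is the SM version of the cubin embedded for this kernel
    // (0 if only PTX is present); ptxVersion is the PTX target version.
    cudaFuncAttributes attr{};
    CHECK_CUDA(cudaFuncGetAttributes(&attr, probeKernel));
    std::printf("probeKernel: binaryVersion=%d, ptxVersion=%d\n",
                attr.binaryVersion, attr.ptxVersion);

    // If the build does not cover this GPU's architecture, the launch below
    // fails with the same "invalid device function" error seen in inference.
    probeKernel<<<1, 1>>>();
    CHECK_CUDA(cudaGetLastError());
    CHECK_CUDA(cudaDeviceSynchronize());
    return 0;
}
```

If this probe launches cleanly when built with flags matching the reported compute capability (for example `nvcc -arch=sm_86 probe.cu` on an Ampere card) but fails when built with the project's flags, rebuilding TensorRT-Alpha with an architecture setting that matches the GPU is the likely fix.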

FeiYull commented 1 year ago

Perhaps it is the same situation; you can refer to issue #8.