NVIDIA-AI-IOT / yolo_deepstream

YOLO model QAT and deployment with DeepStream & TensorRT
Apache License 2.0

Plugin is not working on TX2 NX for Yolo V4 #23

Open caruofc opened 2 years ago

caruofc commented 2 years ago

Hi,

I am using JetPack 4.6. I tried both the prebuilt libnvinfer_plugin.so.8.0.1 from this repo (https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps/tree/master/TRT-OSS/Jetson/TRT8.0) and a plugin I built myself. Both give me the following error:

```
[04/25/2022-16:52:55] [E] [TRT] 2: [pluginV2Runner.cpp::execute::267] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed.)
&&&& FAILED TensorRT.sample_yolo [TensorRT v8001] # ../bin/yolov4 --fp16
```

JetPack 4.6 comes with TensorRT 8.0.1, cuDNN 8.2.1, and CUDA 10.2.
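
In case it helps with diagnosis, below is a small standalone check I put together (my own sketch, not code from this repo or the yolov4 sample; the file name and build command are assumptions for a stock JetPack 4.6 install). It prints the TensorRT version the process actually links against and lists the plugin creators registered by libnvinfer_plugin.so, so a mismatch between the rebuilt plugin and the TensorRT 8.0.1 runtime would be visible:

```cpp
// trt_plugin_check.cpp -- diagnostic sketch only; file name and build flags are assumptions.
// Build (typical JetPack default paths, adjust as needed):
//   g++ trt_plugin_check.cpp -o trt_plugin_check -lnvinfer -lnvinfer_plugin
#include <cstdio>

#include <NvInfer.h>
#include <NvInferPlugin.h>

// Minimal logger; initLibNvInferPlugins() needs one.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        if (severity <= Severity::kWARNING)
        {
            std::printf("[TRT] %s\n", msg);
        }
    }
};

int main()
{
    // Version of the libnvinfer actually loaded at runtime, e.g. 8001 for TensorRT 8.0.1.
    std::printf("getInferLibVersion() = %d\n", getInferLibVersion());

    // Registers the creators exported by whichever libnvinfer_plugin.so the loader picked up.
    Logger logger;
    initLibNvInferPlugins(&logger, "");

    int32_t numCreators = 0;
    nvinfer1::IPluginCreator* const* creators =
        getPluginRegistry()->getPluginCreatorList(&numCreators);

    std::printf("%d plugin creators registered:\n", numCreators);
    for (int32_t i = 0; i < numCreators; ++i)
    {
        std::printf("  %s (version %s)\n", creators[i]->getPluginName(), creators[i]->getPluginVersion());
    }
    return 0;
}
```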

Need help. Thanks.