linghu8812 / tensorrt_inference


Error when running inference #173

Closed — bingnoi closed this issue 8 months ago

bingnoi commented 8 months ago
./tensorrt_inference yolov5 ../configs/yolov5/config.yaml ../samples/detection_segmentation
[11/05/2023-16:18:50] [W] [TRT] Unable to determine GPU memory usage
[11/05/2023-16:18:50] [W] [TRT] Unable to determine GPU memory usage
[11/05/2023-16:18:50] [I] [TRT] [MemUsageChange] Init CUDA: CPU +0, GPU +0, now: CPU 5, GPU 0 (MiB)
[11/05/2023-16:18:50] [W] [TRT] CUDA initialization failure with error: 35. Please check your CUDA installation:  http://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html
tensorrt_inference: /datadisk2/lixinhao/tensorrt_inference/code/src/model.cpp:32: void Model::OnnxToTRTModel(): Assertion `builder != nullptr' failed.
Aborted (core dumped)
bingnoi commented 8 months ago

The reason is that the installed CUDA driver is not compatible with this TensorRT build. CUDA error 35 is cudaErrorInsufficientDriver: the driver supports an older CUDA version than the runtime TensorRT was compiled against, so CUDA initialization fails and createInferBuilder returns nullptr, which trips the `builder != nullptr` assertion in model.cpp.
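A minimal sketch of what the compatibility check boils down to: CUDA error 35 means the driver's maximum supported CUDA version is lower than the CUDA runtime version the TensorRT binaries link against, so fixing it means upgrading the driver or using a TensorRT build matching an older CUDA. The version numbers below are hypothetical, chosen only to illustrate the mismatch; the helper names are mine, not from the repo.

```python
# Sketch: CUDA error 35 (cudaErrorInsufficientDriver) boils down to a
# version comparison: the driver must support a CUDA version at least
# as new as the runtime that TensorRT was built against.

def parse_cuda_version(text: str) -> tuple[int, int]:
    """Parse a 'major.minor' CUDA version string, e.g. '11.4' -> (11, 4)."""
    major, minor = text.split(".")[:2]
    return int(major), int(minor)

def driver_supports_runtime(driver_cuda: str, runtime_cuda: str) -> bool:
    """True if the driver's supported CUDA version covers the runtime's."""
    return parse_cuda_version(driver_cuda) >= parse_cuda_version(runtime_cuda)

# Hypothetical example: a driver capped at CUDA 10.2 cannot initialize a
# TensorRT build that links against the CUDA 11.4 runtime -> error 35.
print(driver_supports_runtime("10.2", "11.4"))  # False
print(driver_supports_runtime("12.1", "11.4"))  # True
```

In practice you would compare the "CUDA Version" reported by `nvidia-smi` (the driver's maximum supported version) against the CUDA version listed in the TensorRT release notes for your TensorRT version.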