triple-Mu / YOLOv8-TensorRT

YOLOv8 using TensorRT accelerate !
MIT License

Detection model inference fails! #172

Closed: tendrillion closed this issue 10 months ago

tendrillion commented 10 months ago

I set up the environment according to the README and converted the model. Inference with YOLOv8-TensorRT/infer-det.py succeeds, and yolov8-tensorrt/src/detect/end2end/main.cpp also runs inference successfully.

However, inference with YOLOv8-TensorRT/csrc/detect/normal/main.cpp fails. It crashes at line 242 of yolov8-tensorrt/src/detect/normal/include/yolov8.hpp:
output = output.t();

The following exception is thrown: terminate called after throwing an instance of 'cv::Exception' what(): OpenCV(4.7.0-dev) /usr/local/opencv-4.7.0/modules/core/src/matrix_expressions.cpp:24: error: (-5:Bad argument) Matrix operand is an empty matrix. in function 'checkOperandsExist'

OS: Ubuntu 20.04, CUDA: 11.6
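
The "empty matrix" exception indicates that the cv::Mat wrapping the engine output was never populated in the form the normal pipeline expects, which happens when the engine was built from an end2end ONNX (whose outputs are the NMS plugin tensors rather than a single raw prediction tensor). Below is a minimal, hypothetical C++ guard, not the repository's actual code, that illustrates how one might detect this before calling cv::Mat::t(); the function name and explanatory messages are assumptions for illustration only.

```cpp
#include <opencv2/core.hpp>
#include <iostream>

// Hypothetical check (not from yolov8.hpp): bail out with a clear message
// instead of letting cv::Mat::t() throw on an empty operand.
void postprocess_check(const cv::Mat& output)
{
    if (output.empty()) {
        std::cerr << "Engine output is empty: the engine was likely built from an "
                     "end2end ONNX (num_dets/bboxes/scores/labels outputs). "
                     "Re-export a plain ONNX for the normal pipeline." << std::endl;
        return;
    }
    // Safe now: the operand is non-empty, so the transpose succeeds.
    cv::Mat transposed = output.t();
    // ... decode boxes and scores from `transposed` as the normal pipeline does ...
}
```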

tendrillion commented 10 months ago

When running inference with YOLOv8-TensorRT/csrc/detect/normal/main.cpp, the model must be exported with the following command: yolo export model=yolov8s.pt format=onnx opset=11 simplify=True
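
For reference, a sketch of the full workflow this implies, assuming the TensorRT engine is then built from the exported ONNX with trtexec (the file names and the --fp16 flag are examples, not mandated by the repository):

```bash
# 1. Export a plain ONNX model (no end2end NMS plugin) for the normal pipeline.
yolo export model=yolov8s.pt format=onnx opset=11 simplify=True

# 2. Build a TensorRT engine from that ONNX, e.g. with trtexec.
trtexec --onnx=yolov8s.onnx --saveEngine=yolov8s.engine --fp16
```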

triple-Mu commented 10 months ago

Ok.