linghu8812 / tensorrt_inference


`engine->getNbBindings() == 2' failed #69

Open Zejun-Yang opened 3 years ago

Zejun-Yang commented 3 years ago

Hello, I followed the steps described in linghu8812/tensorrt_inference/yolov5/README: I downloaded https://github.com/ultralytics/yolov5, downloaded https://github.com/linghu8812/yolov5, copied export_onnx.py from https://github.com/linghu8812/yolov5 into https://github.com/ultralytics/yolov5, and converted the official yolov5s.pt model. However, the TensorRT side still reports this error. Did I misunderstand one of the steps?

I also tried training a model directly with https://github.com/linghu8812/yolov5 and converting it with export_onnx.py, but that failed with the following error: Starting ONNX export with onnx 1.8.0... ONNX export failure: Exporting the operator silu to ONNX opset version 12 is not supported. Please open a bug to request ONNX export support for the missing operator. So the conversion did not succeed.
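For reference, a common workaround for this class of export failure (not taken from this thread, and the fork may handle it differently) is to swap nn.SiLU modules for an explicit x * sigmoid(x) implementation that opset 12 can express. A minimal sketch, with placeholder paths:

```python
# Hypothetical sketch: replace torch.nn.SiLU with an export-friendly
# version built from ops that ONNX opset 12 supports.
import torch
import torch.nn as nn


class SiLU(nn.Module):
    """SiLU expressed as x * sigmoid(x) so torch.onnx.export can trace it."""
    @staticmethod
    def forward(x):
        return x * torch.sigmoid(x)


def replace_silu(model: nn.Module) -> nn.Module:
    # Recursively swap every nn.SiLU child for the export-friendly variant.
    for name, module in model.named_children():
        if isinstance(module, nn.SiLU):
            setattr(model, name, SiLU())
        else:
            replace_silu(module)
    return model


# Usage (paths and input size are assumptions, not from the repo):
# model = torch.load('weights/yolov5s.pt', map_location='cpu')['model'].float().eval()
# model = replace_silu(model)
# torch.onnx.export(model, torch.zeros(1, 3, 640, 640), 'yolov5s.onnx', opset_version=12)
```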

Thank you very much for your reply.

linghu8812 commented 3 years ago

Please test the ONNX model as described here: https://github.com/linghu8812/tensorrt_inference/issues/12#issuecomment-745724887
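A minimal sketch of such a check (the exact script in the linked comment may differ; onnxruntime and the file name are assumptions):

```python
# Run the exported ONNX model on a dummy input and print the output shapes.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession('yolov5s.onnx')          # path is a placeholder
input_name = session.get_inputs()[0].name
dummy = np.zeros((1, 3, 640, 640), dtype=np.float32)    # batch 1, 640x640 assumed

outputs = session.run(None, {input_name: dummy})
for meta, out in zip(session.get_outputs(), outputs):
    print(meta.name, out.shape)

# The fork's export_onnx.py is expected to produce a single concatenated
# detection output; per-feature-map outputs such as (1, 3, 80, 80, 85)
# suggest the official ultralytics export was used instead.
```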

Zejun-Yang commented 3 years ago

The ONNX model exported under https://github.com/ultralytics/yolov5 produces an output of shape (1, 3, 80, 80, 85). Which step went wrong?

Zejun-Yang commented 3 years ago

I have now obtained an ONNX model with the correct output, thank you very much for your help!!! The model has to be converted with the yolov5 fork you provide, not with the official yolov5.

Also, you may want to revise the wording at the beginning of the README, for example:

2. Export ONNX Model: git clone https://github.com/linghu8812/yolov5.git, then run export_onnx.py in https://github.com/linghu8812/yolov5/models to generate yolov5s.onnx and so on. Attention: do not convert the model with https://github.com/ultralytics/yolov5, otherwise you may get the wrong output size.

export PYTHONPATH="$PWD" && python3 models/export_onnx.py --weights ./weights/yolov5s.pt --img 640 --batch 1
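After exporting, one way to catch a wrong export before building the TensorRT engine is to count the graph's inputs and outputs: the demo asserts engine->getNbBindings() == 2, i.e. exactly one input binding plus one output binding. A sketch, assuming the onnx Python package and a placeholder file name:

```python
# Verify the exported graph has exactly one input and one output,
# which is what the getNbBindings() == 2 assertion corresponds to.
import onnx

model = onnx.load('yolov5s.onnx')                         # path is a placeholder
initializers = {init.name for init in model.graph.initializer}
# Some exporters list weight initializers among graph inputs; exclude them.
inputs = [i.name for i in model.graph.input if i.name not in initializers]
outputs = [o.name for o in model.graph.output]

print('inputs :', inputs)
print('outputs:', outputs)
assert len(inputs) == 1 and len(outputs) == 1, \
    'expected one input and one output so that getNbBindings() == 2 holds'
```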