lewes6369 / TensorRT-Yolov3

TensorRT for Yolov3
MIT License
489 stars 165 forks

Use ONNX model for inference #75

Open JensenHJS opened 4 years ago

JensenHJS commented 4 years ago

I can provide the ONNX model. Do you currently plan to support ONNX models for YOLOv3? The original model is yolov3.weights; I converted it to yolov3-608.onnx using the Python API, and then converted the ONNX model to yolo-608.trt. The yolov3.weights file was downloaded from the official darknet website. https://drive.google.com/drive/folders/1fRcxY5YgEQ8DUmS1tEdDKxL45SsKkvFh?usp=sharing

lewes6369 commented 4 years ago

As far as I know, NVIDIA already ships an official sample with TensorRT 6.0 that runs the ONNX YOLOv3 model. I believe it is written in Python.
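For reference, the NVIDIA sample mentioned above ships with the TensorRT Python samples as `yolov3_onnx`. The sketch below shows the usual two-step flow (weights → ONNX → TensorRT engine); the install path and script flags are assumptions that vary by TensorRT version and packaging, so check your local samples directory.

```shell
# Location assumed from a typical Debian/JetPack TensorRT install;
# adjust the path for your setup (or use the TensorRT OSS repo checkout).
cd /usr/src/tensorrt/samples/python/yolov3_onnx

# Step 1: download yolov3.weights/cfg and convert them to an ONNX graph.
python yolov3_to_onnx.py

# Step 2: parse the ONNX graph, build a TensorRT engine, and run inference
# on a sample image (the script also does the YOLO post-processing on host).
python onnx_to_tensorrt.py
```

Note that this sample stops the ONNX graph at the three raw convolution outputs and decodes the boxes in Python afterwards, which is relevant to the discussion below.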

JensenHJS commented 4 years ago

Thank you for your reply. I have solved my problem.

ttdd11 commented 4 years ago

@JensenHJS this repo uses a YOLO last layer (rather than the three convolution output layers). Were you able to get that into your ONNX model?

JensenHJS commented 4 years ago

ONNX doesn't support the YOLO layer, so the ONNX/TRT model ends at the raw convolution outputs. The output of the .trt engine then goes through the YOLO layer as a post-processing step, and that gives the final output.
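Since the engine stops at the raw convolution outputs, the YOLO decode has to happen on the host. A minimal sketch of that step is below, assuming the standard YOLOv3 decode (sigmoid on centre/objectness/class logits, exponential on width/height scaled by the anchors); the anchor values, class count, and function name are illustrative assumptions, not taken from this repo.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_yolo_output(raw, anchors, num_classes, stride):
    """Decode one raw conv output of shape (N, A*(5+C), H, W) into boxes.

    Returns an array of shape (N, A*H*W, 5+C):
    [cx, cy, w, h, objectness, class scores...] with centres in input-image
    pixels and widths/heights in anchor units scaled by exp(t).
    """
    n, _, h, w = raw.shape
    a, c = len(anchors), num_classes
    raw = raw.reshape(n, a, 5 + c, h, w)

    # Per-cell grid offsets, broadcast over batch and anchor dimensions.
    gy, gx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")

    tx, ty, tw, th = raw[:, :, 0], raw[:, :, 1], raw[:, :, 2], raw[:, :, 3]
    cx = (sigmoid(tx) + gx) * stride          # box centre x in pixels
    cy = (sigmoid(ty) + gy) * stride          # box centre y in pixels
    pw = np.array([aw for aw, _ in anchors]).reshape(1, a, 1, 1)
    ph = np.array([ah for _, ah in anchors]).reshape(1, a, 1, 1)
    bw = pw * np.exp(tw)                      # box width
    bh = ph * np.exp(th)                      # box height
    obj = sigmoid(raw[:, :, 4])               # objectness score
    cls = sigmoid(raw[:, :, 5:])              # per-class scores

    out = np.concatenate(
        [cx[:, :, None], cy[:, :, None], bw[:, :, None], bh[:, :, None],
         obj[:, :, None], cls], axis=2)
    # (N, A, 5+C, H, W) -> (N, A*H*W, 5+C)
    return out.transpose(0, 1, 3, 4, 2).reshape(n, a * h * w, 5 + c)

# Usage on a dummy stride-32 head for a 608x608 input (19x19 grid, 80 classes);
# the anchors are the commonly published YOLOv3 large-object anchors.
anchors = [(116, 90), (156, 198), (373, 326)]
raw = np.random.randn(1, 3 * (5 + 80), 19, 19).astype(np.float32)
boxes = decode_yolo_output(raw, anchors, num_classes=80, stride=32)
```

The same function is applied to each of the three output scales (strides 32, 16, 8) with that scale's anchors, after which the usual confidence threshold and NMS produce the final detections.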