linghu8812 / tensorrt_inference


Convert yolov4 to ONNX with custom model #76

Closed MoaazAbdulrahman closed 3 years ago

MoaazAbdulrahman commented 3 years ago

Hello @linghu8812, thank you for your great effort. I have a YOLOv4 model trained on a custom dataset with 5 classes. Is it possible to convert it to TensorRT?

I have changed the number of classes in export_onnx.py from 80 to 5 and loaded my own weights and cfg files.
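For context, the size of a YOLOv4 head's output depends directly on the class count: each head predicts 3 * (num_classes + 5) channels, so the cfg's filters= values and any downstream buffer sizes must all agree with the new class count. A minimal sketch of how the expected output size shifts when going from 80 to 5 classes (plain arithmetic only; it assumes the standard 608x608 input, strides 8/16/32, and 3 anchors per head, so adjust if your cfg differs):

```python
# Minimal sketch: how the YOLOv4 output size changes with the class count.
# Assumes a 608x608 input, strides 8/16/32 and 3 anchors per head;
# adjust these values if your cfg differs.
def yolo_output_elements(num_classes, input_size=608, strides=(8, 16, 32), anchors_per_head=3):
    """Total number of floats produced by the three YOLO heads per image."""
    boxes = sum(anchors_per_head * (input_size // s) ** 2 for s in strides)
    return boxes * (num_classes + 5)

print(yolo_output_elements(80))  # COCO-pretrained model
print(yolo_output_elements(5))   # custom 5-class model -> much smaller output buffer
```

If the inference side still sizes and parses its buffers for 80 classes (it derives the class count from the names file referenced in the YAML config), it would read past the end of the smaller 5-class output, which could explain a segmentation fault like the one below.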

After conversion I get the following error:

loading filename from:../yolov4.trt
deserialize done
binding0: 283901952
binding1: 58222080
Processing: ../samples/bus.jpg
prepareImage
prepare image take: 5.57146 ms.
host2device
Segmentation fault (core dumped)

Is there anything else I have to do? Thanks

Note: I have tested the pretrained model and it works fine, but my custom model gives the above error.

linghu8812 commented 3 years ago

Try changing the names file to your own: https://github.com/linghu8812/tensorrt_inference/blob/463c6f06bee6b5570353d55fbd64b99832ce3e86/Yolov4/config-tiny.yaml#L4
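In other words, the names-file entry in the YAML config must point to a file listing exactly your 5 class names, since the post-processing takes the class count from it. A small sanity-check sketch, assuming the key is named labels_file under a yolov4-tiny section (verify the actual key and section names against config-tiny.yaml in the repo) and a hypothetical custom names file:

```python
# Sanity check: class count in the names file referenced by the YAML config.
# The key names ("yolov4-tiny", "labels_file") are assumptions based on the
# linked config-tiny.yaml; adapt them to the actual file in your checkout.
import yaml

with open("config-tiny.yaml") as f:
    cfg = yaml.safe_load(f)

labels_path = cfg["yolov4-tiny"]["labels_file"]  # assumed structure; verify against the repo
with open(labels_path) as f:
    names = [line.strip() for line in f if line.strip()]

# For a 5-class custom model this should print 5 followed by your class names,
# matching the num_classes used when exporting the ONNX model.
print(len(names), names)
```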