AlexeyAB / darknet

YOLOv4 / Scaled-YOLOv4 / YOLO - Neural Networks for Object Detection (Windows and Linux version of Darknet )
http://pjreddie.com/darknet/

Export darknet weights to ONNX #7002

Open linghu8812 opened 3 years ago

linghu8812 commented 3 years ago

Hello everyone, here is a script that can convert darknet weights to an ONNX model. It currently supports converting the yolov4, yolov4-tiny, yolov3, yolov3-spp and yolov3-tiny darknet models. The exported model can then be converted to a TensorRT engine via TensorRT's ONNX parser for inference acceleration. The repo also contains TensorRT inference code.

Code file: https://github.com/linghu8812/tensorrt_inference/blob/master/project/Yolov4/export_onnx.py

After the YOLO layers, Transpose and Concat layers are added to make the post-processing of the model easier. The structure of the Transpose layer is shown in the following figure:

AlexeyAB commented 3 years ago

@linghu8812 Hi, did you try to port the new model https://github.com/AlexeyAB/darknet/issues/6987#issuecomment-728923404

with some changes: https://github.com/AlexeyAB/darknet/issues/6987#issuecomment-729218069

linghu8812 commented 3 years ago

@AlexeyAB Hi, I have tested it; export_onnx.py can now export the yolov4x-mish ONNX model. For the new new_coords feature, I add a Sigmoid layer after the Concat layer, which also makes post-processing easier. The structure of the model is shown below:

[image: yolov4x-mish ONNX model structure]

Run this command to export the yolov4x-mish.onnx model:

python3 export_onnx.py --cfg_file cfg/yolov4x-mish.cfg --weights_file yolov4x-mish.weights --output_file yolov4x-mish.onnx --sigmoid

I have tested yolov4x-mish.onnx with my TensorRT inference code; the result is shown below, and the scores of the detected objects are consistent with the darknet outputs:

[image: detection results from TensorRT inference]

dsbyprateekg commented 3 years ago

@linghu8812 Can you please share how you created the graphs you mentioned above?

linghu8812 commented 3 years ago

@dsbyprateekg Hi, the code is here: https://github.com/linghu8812/tensorrt_inference/blob/master/project/Yolov4/export_onnx.py

xiaochus commented 3 years ago

@linghu8812 The generated ONNX model can't be used with the MNN framework on mobile devices, because MNN doesn't support the ONNX Conv attribute auto_pad='SAME_LOWER'.

linghu8812 commented 3 years ago

> The generated ONNX model can't be used with the MNN framework on mobile devices, because MNN doesn't support the ONNX Conv attribute auto_pad='SAME_LOWER'.

@xiaochus It may be caused by how the conv nodes are built; however, I am not familiar with MNN. If I knew which operations MNN supports, I could modify it.