HaloTrouvaille / YOLO-Multi-Backbones-Attention

Model compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention modules, pruning, and quantization
https://github.com/HaloTrouvaille/YOLO-Multi-Backbones-Attention

Why don't you add code to forward in create_module? #22

Open · xiaomahencai opened 3 years ago

xiaomahencai commented 3 years ago

Do we need to add code to forward in create_module? Thank you for your help!
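
For context: in the common ultralytics-style YOLOv3 layout that this repo appears to follow, create_modules only parses the cfg file and appends layers to an nn.ModuleList; each attention layer is an nn.Module that carries its own forward, and the model's forward simply walks the list. A minimal sketch under that assumption (the SEBlock class and its reduction parameter are illustrative, not this repo's exact code):

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation channel attention (illustrative).

    The attention logic lives in this module's own forward(), so
    create_modules() only needs to append an instance to the module
    list; no extra forward code is required inside create_modules().
    """
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)   # squeeze: global spatial average
        self.fc = nn.Sequential(              # excitation: per-channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # reweight the feature-map channels


# The Darknet-style model then runs every layer uniformly, e.g.:
#   for module in self.module_list:
#       x = module(x)
# which is why attention layers need no special case in create_modules().

if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    print(SEBlock(64)(x).shape)  # torch.Size([1, 64, 32, 32])
```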

HaloTrouvaille commented 3 years ago

If you need the previous model with attention, please add me on WeChat.