HaloTrouvaille / YOLO-Multi-Backbones-Attention

Model compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention modules, pruning, and quantization
https://github.com/HaloTrouvaille/YOLO-Multi-Backbones-Attention
493 stars · 118 forks

“activation=linear” #25

Open VegetablesBird6 opened 3 years ago

VegetablesBird6 commented 3 years ago

“activation=linear” appears in the cfg files, but there is no corresponding handling in “models.py”. Can you tell me what it means and how I should handle it?
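For context: in Darknet-style cfg files, `activation=linear` conventionally means the identity function, i.e. no nonlinearity is applied after the layer (it typically appears on shortcut layers and on the convolution just before a YOLO detection layer). A minimal, framework-agnostic sketch of how a cfg parser could map activation names to functions (this is an assumption about the intent, not the repo's actual `models.py` code):

```python
def leaky_relu(x, slope=0.1):
    # Darknet's default "leaky" activation
    return x if x > 0 else slope * x

def linear(x):
    # "linear" activation: pass the input through unchanged (identity)
    return x

# Lookup table the parser can consult for each layer's "activation=" value
ACTIVATIONS = {"leaky": leaky_relu, "linear": linear}

def apply_activation(name, x):
    """Apply the activation named in the cfg to a scalar input."""
    try:
        return ACTIVATIONS[name](x)
    except KeyError:
        raise ValueError(f"unsupported activation: {name}")
```

In a PyTorch model builder, the same idea would map `"linear"` to `nn.Identity()` (or simply omit the activation module) when constructing the layer.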