HaloTrouvaille / YOLO-Multi-Backbones-Attention

Model Compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention modules, pruning, and quantization
https://github.com/HaloTrouvaille/YOLO-Multi-Backbones-Attention

Where in your code are the CBAM and ECA blocks defined? #16

Closed zunairaR closed 3 years ago

lucasjinreal commented 4 years ago

Same question here. Also, where in the network is it inserted?

HaloTrouvaille commented 4 years ago

Sorry for the late response. Please add me on WeChat to get the former version.

lucasjinreal commented 4 years ago

@HaloTrouvaille what's your WeChat?

ouening commented 4 years ago

Hello, could you add support for CBAM and CAE to this repository?

HaloTrouvaille commented 3 years ago

WeChat: AutrefoisLethe

Vrishabhdhwaj commented 2 years ago

I have a similar question, but I cannot access WeChat. Is there any other way you can share these files?
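
Since the modules in question were only shared off-thread via WeChat, here is a minimal PyTorch sketch of what ECA and CBAM blocks conventionally look like, following the ECA-Net (Wang et al., 2020) and CBAM (Woo et al., 2018) papers. The class names, hyperparameters, and placement below are illustrative assumptions, not this repository's actual code:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ECA(nn.Module):
    """Efficient Channel Attention (ECA-Net): global average pooling
    followed by a 1-D convolution across the channel dimension."""
    def __init__(self, channels, gamma=2, b=1):
        super().__init__()
        # Kernel size adapts to the channel count, as in the ECA-Net paper.
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):                                # x: (B, C, H, W)
        y = F.adaptive_avg_pool2d(x, 1)                  # (B, C, 1, 1)
        y = self.conv(y.squeeze(-1).transpose(-1, -2))   # (B, 1, C)
        y = y.transpose(-1, -2).unsqueeze(-1)            # (B, C, 1, 1)
        return x * torch.sigmoid(y)

class ChannelAttention(nn.Module):
    """Channel half of CBAM: a shared MLP over avg- and max-pooled features."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        return torch.sigmoid(avg + mx)

class SpatialAttention(nn.Module):
    """Spatial half of CBAM: a 7x7 conv over stacked channel avg/max maps."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2,
                              bias=False)

    def forward(self, x):
        avg = torch.mean(x, dim=1, keepdim=True)
        mx, _ = torch.max(x, dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    """CBAM: channel attention applied first, then spatial attention."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        x = x * self.ca(x)
        return x * self.sa(x)

# Illustrative placement: such blocks are typically inserted between backbone
# stages or just before a YOLO detection head. Shapes here are made up.
feat = torch.randn(1, 256, 52, 52)
feat = CBAM(256)(feat)
feat = ECA(256)(feat)
print(feat.shape)  # torch.Size([1, 256, 52, 52])
```

In YOLOv3 forks these attention layers are usually registered in the cfg/module list and applied once per feature scale; the exact hook points in this repo's former version would still need the files the maintainer shared over WeChat.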