Open ChenJian7578 opened 1 year ago
@ChenJian7578 the insertion position is the same as for all comparison schemes, such as NAM, CoordAttention, and YOLOv5x
Emm... Using YOLOv5 as an example, could you explain in detail where you added the attention module? This might provide some reference for my current work on improving network performance on my own dataset by adding attention modules.
Attention modules are usually placed in the deeper layers of the network. In our paper, all attention modules are inserted after the last two C3 layers of the YOLOv5s backbone (I think the feature representation of the shallow layers is already strong enough, so they can do without an attention mechanism). For more information you can refer to [yolov5_research](https://github.com/positive666/yolov5_research).
May I ask if the source code for the EMA attention module is publicly available?
@YOLOonMe Hi, thank you very much for your reply, but I have not found a standalone EMA module in it. Do you currently have a plug-and-play module?
@YOLOonMe Hello, I also need the EMA module code to add to my network. Can you share a file with only the EMA module? Thank you.
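For anyone looking for a plug-and-play block, here is a minimal PyTorch sketch of an EMA-style attention module following the paper's description (channel grouping by a `factor`, parallel 1x1 and 3x3 branches, and cross-spatial aggregation). This is my reading of the method, not the official release; the class and argument names are my own.

```python
import torch
import torch.nn as nn


class EMA(nn.Module):
    """Sketch of an EMA-style attention block (grouped cross-spatial attention).

    `channels` must be divisible by `factor`; each group of
    `channels // factor` channels is attended independently.
    """

    def __init__(self, channels: int, factor: int = 32):
        super().__init__()
        self.groups = factor
        assert channels % self.groups == 0 and channels // self.groups > 0
        cg = channels // self.groups
        self.softmax = nn.Softmax(dim=-1)
        self.agp = nn.AdaptiveAvgPool2d((1, 1))       # global pooling
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # pool along width
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # pool along height
        self.gn = nn.GroupNorm(cg, cg)
        self.conv1x1 = nn.Conv2d(cg, cg, kernel_size=1)
        self.conv3x3 = nn.Conv2d(cg, cg, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.size()
        cg = c // self.groups
        # Fold groups into the batch dimension: (b*groups, cg, h, w)
        gx = x.reshape(b * self.groups, cg, h, w)
        # 1x1 branch: directional pooling, shared 1x1 conv, then re-split
        x_h = self.pool_h(gx)                               # (bg, cg, h, 1)
        x_w = self.pool_w(gx).permute(0, 1, 3, 2)           # (bg, cg, w, 1)
        hw = self.conv1x1(torch.cat([x_h, x_w], dim=2))
        x_h, x_w = torch.split(hw, [h, w], dim=2)
        x1 = self.gn(gx * x_h.sigmoid() * x_w.permute(0, 1, 3, 2).sigmoid())
        # 3x3 branch: local context
        x2 = self.conv3x3(gx)
        # Cross-spatial learning: each branch's global descriptor
        # reweights the other branch's spatial map via matmul
        x11 = self.softmax(self.agp(x1).reshape(b * self.groups, cg, 1)
                           .permute(0, 2, 1))               # (bg, 1, cg)
        x12 = x2.reshape(b * self.groups, cg, h * w)        # (bg, cg, hw)
        x21 = self.softmax(self.agp(x2).reshape(b * self.groups, cg, 1)
                           .permute(0, 2, 1))
        x22 = x1.reshape(b * self.groups, cg, h * w)
        weights = (torch.matmul(x11, x12) + torch.matmul(x21, x22)) \
            .reshape(b * self.groups, 1, h, w)
        return (gx * weights.sigmoid()).reshape(b, c, h, w)
```

The block keeps the input shape, so it can be dropped after a C3 layer in a YOLOv5 backbone yaml without changing downstream channel counts, e.g. `y = EMA(256)(x)` for a 256-channel feature map.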
In YOLO, if EMA_attention is placed behind the detection head, what should the grouping factor be set to? If the output has 64 or 32 channels, does the factor still need to be set to 32?
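One practical constraint worth noting (an observation from the grouping design, not an official answer): the factor must divide the channel count, since each group needs at least one channel. A quick sanity check:

```python
def valid_factor(channels: int, factor: int) -> bool:
    """Return True if `factor` is a usable group count for `channels`:
    it must be >= 1 and divide the channel count evenly."""
    return factor >= 1 and channels % factor == 0


# factor=32 works for 64- or 32-channel outputs (2 or 1 channels per group),
# but not for channel counts that 32 does not divide.
print(valid_factor(64, 32))  # True
print(valid_factor(32, 32))  # True
print(valid_factor(48, 32))  # False
```

So factor=32 is still legal for 64- or 32-channel outputs, though a smaller factor (e.g. 8 or 16) leaves more channels per group; which setting performs best would need to be tuned empirically.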
May I ask whether you have found the best setting for the factor parameter?
At what position in the network did you add the attention module when you conducted the experiments?