Open joe660 opened 3 years ago
+1, also interested in the answer to this, or whether it would be possible to do in the current Darknet.
What do you mean?
There are many attention approaches: ASFF, SE, CBAM, SAM, Transformer, ... and they can be placed in different places.
There are weights and cfg files with SAM: https://github.com/AlexeyAB/darknet/wiki/YOLOv4-model-zoo
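For context, in AlexeyAB's Darknet the modified SAM from the YOLOv4 paper is usually expressed in the cfg as a 1x1 convolution with `logistic` (sigmoid) activation, followed by a `[sam]` layer that multiplies the resulting attention map element-wise with an earlier feature map. The filter count and `from` offset below are illustrative, not copied from a specific model-zoo file:

```ini
# 1x1 conv produces the attention map; logistic activation = sigmoid gate
[convolutional]
size=1
stride=1
pad=1
filters=512        # illustrative; should match the channels of the gated layer
activation=logistic

# element-wise multiply the sigmoid map with the layer that `from` points at
[sam]
from=-2
```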
Why is it that when I add an attention mechanism in front of each shortcut layer, the final F1 score is slightly lower than the baseline?
I ran into the same problem. I asked several classmates, and for them the attention mechanism also did not improve the U version, so it is not just me; yet in other common models an attention mechanism does help. I really don't know how to solve this.
Why is it that when I add an attention mechanism in front of each shortcut layer, the final F1 score is slightly lower than the baseline?
- Can you show 2 screenshots to compare?
- What is the attention mechanism?
- What is the baseline?
- What is the U version?
The U version refers to the PyTorch version of YOLOv4, by the second author. Baseline means yolo.cfg. I also tried SE, CBAM, ECA, and other attention mechanisms; the results are not as good as the original. I added them after each residual block. Where do you think it would be better to add them?
@joe660 @WongKinYiu Same question: where should we add this [SAM] module in the cfg, and in which scenario?
For the U version, adding SAM at the beginning of the CSP blocks of the PAN increases AP on COCO by ~0.2%.
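For anyone unsure what the modified SAM discussed here actually computes: the YOLOv4 paper replaces CBAM's spatial pooling with a plain pointwise (1x1) convolution whose sigmoid output gates the input feature map element-wise. A minimal dependency-free sketch (function name, nested-list tensor layout, and the per-channel gating are my assumptions, not the authors' code):

```python
import math

def modified_sam(feature, weight):
    """Modified SAM (YOLOv4-style): 1x1 conv + sigmoid, then
    element-wise multiply with the input feature map.

    feature: list of C channel maps, each H x W (nested lists) -- assumed layout
    weight:  C x C matrix for the pointwise (1x1) convolution
    """
    C = len(feature)
    H, W = len(feature[0]), len(feature[0][0])
    out = []
    for o in range(C):
        channel = []
        for i in range(H):
            row = []
            for j in range(W):
                # pointwise conv across channels at pixel (i, j)
                s = sum(weight[o][c] * feature[c][i][j] for c in range(C))
                gate = 1.0 / (1.0 + math.exp(-s))  # sigmoid attention value
                row.append(feature[o][i][j] * gate)
            channel.append(row)
        out.append(channel)
    return out
```

With zero weights the gate is sigmoid(0) = 0.5 everywhere, so the output is simply half the input; a trained conv instead learns where to amplify or suppress the feature map.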
@joe660 Hey, did you ever get the attention mechanism added?
Does the author have a cfg file with an attention mechanism? Thanks.