houqb / CoordAttention

Code for our CVPR2021 paper coordinate attention
MIT License

Is there a performance improvement for CA #26

Closed fanghua2021 closed 3 years ago

fanghua2021 commented 3 years ago

I added the CA module to GhostNet, and the mAP dropped. Adding the CA module at the Neck's Concat also decreased accuracy. Why? By contrast, adding SENet does increase accuracy.

Fateeeeee commented 3 years ago

Did you solve it? I ran into the same problem.

fanghua2021 commented 3 years ago

No, I'm still reading the original paper. Calling on the original author's team for help.

houqb commented 3 years ago

Not sure what happened. Be sure to add the CA modules after layers with channel expansion.
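A minimal sketch of what that placement could look like, assuming a GhostNet/MobileNet-style inverted-residual bottleneck: the attention sits right after the pointwise expansion, where the channel count is largest, rather than after the projection. The `CoordAtt` class below is a simplified re-implementation for illustration only; the constructor arguments, the ReLU in place of h-swish, and the `BottleneckWithCA` wrapper are assumptions and may differ from the repository's official `coordatt.py`.

```python
import torch
import torch.nn as nn


class CoordAtt(nn.Module):
    """Simplified coordinate attention: pool along H and W separately,
    share a 1x1 bottleneck, then re-weight the input per direction."""

    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)  # paper uses h-swish; ReLU keeps the sketch short
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        n, c, h, w = x.shape
        # Direction-aware pooling: (N, C, H, 1) and (N, C, W, 1), concatenated spatially.
        x_h = x.mean(dim=3, keepdim=True)                      # (N, C, H, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)  # (N, C, W, 1)
        y = torch.cat([x_h, x_w], dim=2)                       # (N, C, H+W, 1)
        y = self.act(self.bn1(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (N, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (N, C, 1, W)
        return x * a_h * a_w


class BottleneckWithCA(nn.Module):
    """Inverted-residual-style block: expand -> depthwise -> CA -> project."""

    def __init__(self, inp, oup, expand_ratio=4, stride=1):
        super().__init__()
        hidden = inp * expand_ratio
        self.use_res = stride == 1 and inp == oup
        self.block = nn.Sequential(
            nn.Conv2d(inp, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, 3, stride, 1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            CoordAtt(hidden),  # <- after channel expansion, not after the projection
            nn.Conv2d(hidden, oup, 1, bias=False),
            nn.BatchNorm2d(oup),
        )

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_res else out


if __name__ == "__main__":
    x = torch.randn(2, 32, 56, 56)
    print(BottleneckWithCA(32, 32)(x).shape)  # torch.Size([2, 32, 56, 56])
```

Applied to the issue above, this would mean inserting CA on the expanded feature map inside each Ghost bottleneck rather than after the projection layer or at the Neck's Concat, where the channel count has already been reduced.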