xmu-xiaoma666 / External-Attention-pytorch

🍀 Pytorch implementation of various Attention Mechanisms, MLP, Re-parameter, Convolution, which is helpful to further understand papers.⭐⭐⭐
MIT License
11.11k stars 1.91k forks

MLP Confusion #51

Open abhimanyuchadha96 opened 2 years ago

abhimanyuchadha96 commented 2 years ago

https://github.com/xmu-xiaoma666/External-Attention-pytorch/blob/2f80b03ef1cdd835d4a2d21eff6f8b3534e5d601/model/attention/CoAtNet.py#L21

Correct me if I am wrong, but isn't an MLP usually a stack of fully-connected layers rather than convolution layers?

EricPengShuai commented 1 year ago

Same issue with CBAM: https://github.com/xmu-xiaoma666/External-Attention-pytorch/blob/2f80b03ef1cdd835d4a2d21eff6f8b3534e5d601/model/attention/CBAM.py#L13
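For reference, the likely reason both files use convolutions: a `Conv2d` with `kernel_size=1` applies the same weight matrix independently at every spatial position, so it is numerically identical to a fully-connected (`nn.Linear`) layer applied per position — a common idiom in vision code to avoid reshaping. A quick sketch to verify the equivalence (shapes here are arbitrary, not taken from the repo):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
B, C_in, C_out, H, W = 2, 8, 16, 4, 4  # arbitrary example shapes

# A 1x1 convolution mixes channels at each (h, w) position independently.
conv = nn.Conv2d(C_in, C_out, kernel_size=1, bias=True)

# Build a Linear layer with the identical weights and bias.
fc = nn.Linear(C_in, C_out)
fc.weight.data.copy_(conv.weight.data.view(C_out, C_in))
fc.bias.data.copy_(conv.bias.data)

x = torch.randn(B, C_in, H, W)
y_conv = conv(x)  # (B, C_out, H, W)
# Apply the Linear layer per spatial position: move channels last, then back.
y_fc = fc(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)

print(torch.allclose(y_conv, y_fc, atol=1e-6))
```

So a "Conv MLP" with 1×1 kernels computes exactly the same function as a stack of fully-connected layers; the naming is loose but the math matches.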