LeapLabTHU / FLatten-Transformer

Official repository of FLatten Transformer (ICCV2023)

Hello, your work is very meaningful and a valuable reference. I would like to graft it onto a CNN feature-extraction network, applying it after the P3/P4/P5 feature maps as an attention mechanism. Does your code support this? Looking forward to your reply, and thank you very much! #6

Closed 1324039468 closed 1 year ago

tian-qing001 commented 1 year ago

Certainly. Our module can serve as a plug-in and be applied to a variety of modern CNN and ViT architectures. To implement this, I would recommend referring to the FocusedLinearAttention module in swin.py and integrating it into your CNN model as needed.
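
As a rough illustration of the grafting described in the question, the sketch below flattens a CNN feature map (e.g. P3/P4/P5) into a token sequence, applies a sequence-style attention module, and reshapes the result back. The `FeatureMapAttention` wrapper is a hypothetical helper, not part of this repository, and the injected attention module is assumed to map `(B, N, C)` to `(B, N, C)`; check the actual constructor arguments and input format of FocusedLinearAttention in swin.py (e.g. window partitioning and resolution) before plugging it in.

```python
# Minimal sketch: wrapping a token-based attention module (such as
# FocusedLinearAttention from swin.py) around a 2D CNN feature map.
# The wrapper only handles the (B, C, H, W) <-> (B, N, C) reshaping;
# the attention module itself is injected, so its exact interface
# must be verified against swin.py.
import torch
import torch.nn as nn


class FeatureMapAttention(nn.Module):
    """Applies a sequence-style attention module to a 2D CNN feature map."""

    def __init__(self, attn: nn.Module):
        super().__init__()
        # `attn` is assumed to map (B, N, C) -> (B, N, C).
        self.attn = attn

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)   # (B, C, H, W) -> (B, H*W, C)
        tokens = self.attn(tokens)              # attention over spatial tokens
        return tokens.transpose(1, 2).reshape(b, c, h, w)  # back to (B, C, H, W)


if __name__ == "__main__":
    # Identity stand-in just to show the expected tensor flow; replace it
    # with FocusedLinearAttention instantiated as defined in swin.py.
    wrapper = FeatureMapAttention(nn.Identity())
    p4 = torch.randn(2, 256, 32, 32)            # hypothetical P4 feature map
    out = wrapper(p4)
    print(out.shape)                            # torch.Size([2, 256, 32, 32])
```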