LeapLabTHU / FLatten-Transformer

Official repository of FLatten Transformer (ICCV2023)

Attention map visualization #13

Closed CodeStarting-design closed 10 months ago

CodeStarting-design commented 10 months ago

Thank you for your outstanding work. I am very interested in FLatten, and I have tried to visualize attention maps on a grid similar to the figures in the paper. However, each patch is extremely small compared with the original image. How were the results in the paper obtained? Could you provide the corresponding visualization code? I hope to hear from you! [image attached]

tian-qing001 commented 10 months ago

Hi @CodeStarting-design, we genuinely value your interest in and recognition of our work. The visualization results in the paper were generated with DeiT, whose patches are relatively large compared with the input (16×16 patches on a 224×224 image, i.e., a 14×14 grid). We provide the visualization code here, and we hope it addresses your needs.
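
For readers who cannot access the linked snippet, below is a minimal sketch of one way to overlay a DeiT-style attention map on the input image. It is not the repository's released code: the function `visualize_patch_attention`, its parameters, and the assumed shapes (a 224×224 input, a 14×14 patch grid, and an attention tensor with a leading CLS token, captured e.g. via a forward hook) are illustrative assumptions.

```python
# A minimal sketch (not the authors' released script) of overlaying DeiT-style
# patch attention on the input image. It assumes you have already captured an
# attention tensor `attn` of shape (num_heads, N, N) -- e.g. via a forward hook
# on an attention block -- where N = 1 (CLS token) + grid_size * grid_size patches.
import numpy as np
import matplotlib.pyplot as plt
import torch
import torch.nn.functional as F


def visualize_patch_attention(image, attn, query_idx, grid_size=14, alpha=0.6):
    """Overlay the attention distribution of one query patch onto the image.

    image:     (H, W, 3) float array in [0, 1], the resized input (e.g. 224x224).
    attn:      (num_heads, N, N) attention weights with a leading CLS token.
    query_idx: index of the query patch in the grid_size x grid_size layout.
    """
    # Average over heads and drop the CLS token row/column -> (P, P), P = grid_size^2.
    patch_attn = attn.mean(0)[1:, 1:]
    row = patch_attn[query_idx]                        # attention from one query patch
    heat = row.reshape(1, 1, grid_size, grid_size)     # coarse (1, 1, 14, 14) map

    # Upsample the coarse patch grid to the image resolution so the overlay
    # covers the whole picture instead of a tiny 14x14 thumbnail.
    heat = F.interpolate(heat, size=image.shape[:2], mode="bilinear",
                         align_corners=False)[0, 0]
    heat = (heat - heat.min()) / (heat.max() - heat.min() + 1e-8)

    plt.imshow(image)
    plt.imshow(heat.cpu().numpy(), cmap="jet", alpha=alpha)
    plt.axis("off")
    plt.show()


if __name__ == "__main__":
    # Dummy data just to illustrate the expected shapes (224x224 image, 14x14 grid).
    img = np.random.rand(224, 224, 3)
    dummy_attn = torch.softmax(torch.randn(6, 197, 197), dim=-1)
    visualize_patch_attention(img, dummy_attn, query_idx=14 * 7 + 7)
```

In practice you would replace the dummy tensors with a real image tensor and attention weights hooked from a DeiT block; the upsampling step is what keeps the overlay readable even though each individual patch is small relative to the image.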