xiaoman-zhang / KAD


About the attention map #3

Closed: o0t1ng0o closed this issue 1 year ago

o0t1ng0o commented 1 year ago

Hi @chaoyi-wu @xiaoman-zhang ,

Thank you for sharing this code. I am wondering how to obtain the attention maps shown in Figure 5 of your paper?

Thanks.

xiaoman-zhang commented 1 year ago

To obtain the attention maps, we use the attention weights of the last multi-head attention layer. We have updated the 'plot_visualize_512.py' file in the './A3_CLIP' directory; it now includes the code needed to generate the attention maps. Hope this is helpful.
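For readers who want the general idea before looking at 'plot_visualize_512.py', below is a minimal sketch of how attention weights from the last multi-head attention layer of a ViT-style image encoder can be turned into a heatmap. It is not the repo's implementation: the module path `model.visual.blocks[-1].attn`, the 512x512 input size, the 32x32 patch grid, the assumption that token 0 is the [CLS] token, and the function name `plot_attention_map` are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), not the repo's plot_visualize_512.py.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt


def plot_attention_map(model, image, patch_grid=(32, 32), save_path="attn.png"):
    """Overlay the last attention layer's weights on a (1, C, 512, 512) image tensor."""
    captured = {}

    def hook(module, inputs, output):
        # nn.MultiheadAttention returns (attn_output, attn_weights) when
        # need_weights=True (the default); attn_weights is (batch, tgt_len, src_len).
        # Custom attention modules may return something else, so adapt as needed.
        captured["attn"] = output[1]

    # Assumed location of the last multi-head attention layer in the image encoder.
    handle = model.visual.blocks[-1].attn.register_forward_hook(hook)
    with torch.no_grad():
        model.visual(image)  # assumed callable image encoder
    handle.remove()

    attn = captured["attn"][0]      # (tgt_len, src_len) for the single image in the batch
    cls_attn = attn[0, 1:]          # attention from the [CLS] token to every patch token
    h, w = patch_grid
    heatmap = cls_attn.reshape(1, 1, h, w)
    heatmap = F.interpolate(heatmap, size=image.shape[-2:],
                            mode="bilinear", align_corners=False)[0, 0]
    heatmap = (heatmap - heatmap.min()) / (heatmap.max() - heatmap.min() + 1e-8)

    # Overlay the normalized attention heatmap on the input image.
    plt.imshow(image[0].permute(1, 2, 0).cpu().numpy(), cmap="gray")
    plt.imshow(heatmap.cpu().numpy(), cmap="jet", alpha=0.4)
    plt.axis("off")
    plt.savefig(save_path, bbox_inches="tight")
    plt.close()
```

The key design choice is averaging (or taking) the head-wise weights from the final attention layer and upsampling the patch-level scores back to the input resolution; the repo's script may differ in which tokens and heads it visualizes.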