NJU-LHRS / official-CMID

The official implementation of paper "Unified Self-Supervised Learning Framework for Remote Sensing Images".

How to obtain the attention maps of CMID-Swin-B in Fig.12? #13

Closed · tulilin closed this 11 months ago

tulilin commented 1 year ago

Thank you for your wonderful work on CMID; I have read your paper and code. In Fig. 12 of the paper, the attention maps of CMID-Swin-B are analyzed. I am curious how to obtain the attention maps of a Swin Transformer, considering that it does not have a [class] token like ViT. Will the code be provided? I look forward to your reply. Thank you!

pUmpKin-Co commented 1 year ago

Hi~ Thanks for your interest in our work. Actually, we extract the first attention map of the last block in the Swin Transformer and then interpolate it to the original size of the input images. I have added the corresponding code (Pretrain/attn_vis_config.yaml, Pretrain/main_attn_vis.py, Pretrain/models/swin_transformer_with_attn.py). You may need to replace the original code in Pretrain/models/swin_transformer with the new Pretrain/models/swin_transformer_with_attn.py.

This implementation is highly inspired by and borrowed from DINO and iBOT. You may refer to these repos for detailed information.
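Roughly, the procedure looks like the following simplified sketch (not the exact repository code; `forward_with_attn` is an illustrative stand-in for the hook that Pretrain/models/swin_transformer_with_attn.py adds, and the tensor shapes assume Swin's usual windowed-attention layout):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def last_block_attention(model, image):
    """image: (1, 3, H, W) tensor; returns a (num_heads, H, W) attention map."""
    # Hypothetical API: the modified backbone also returns the attention
    # tensor of its last block, shaped (num_windows * B, num_heads, N, N).
    _, attn = model.forward_with_attn(image)

    attn = attn[0]           # first attention map of the last block: (heads, N, N)
    attn = attn.mean(dim=1)  # average over queries -> one score per key token

    heads, n = attn.shape
    side = int(n ** 0.5)     # window tokens form a side x side grid
    attn = attn.reshape(heads, side, side)

    # Interpolate up to the original input resolution for overlaying on the image.
    h, w = image.shape[-2:]
    attn = F.interpolate(attn.unsqueeze(0), size=(h, w),
                         mode="bilinear", align_corners=False)[0]
    return attn
```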

pUmpKin-Co commented 11 months ago

Closing due to a long period of inactivity; feel free to reopen if there is any problem.

tulilin commented 11 months ago


Got it! Thank you very much!