microsoft / Focal-Transformer

[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"
MIT License

Some question about focal_transformer_v2.py #16

Open DQiaole opened 2 years ago

DQiaole commented 2 years ago

Hello, should the `range_h` and `range_w` on lines 176 and 177 of https://github.com/microsoft/Focal-Transformer/blob/main/classification/focal_transformer_v2.py differ across focal levels? In the code, `range_h` and `range_w` of the same size are used at every focal level, which puzzles me.
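For context, the expectation behind the question can be sketched as follows. This is a hypothetical illustration, not the repo's actual code: `make_ranges` and `window_size_glo` are my own names, and it assumes each focal level pools its surrounding region by a factor of `2**level`, so the number of pooled coordinate positions per axis would shrink at deeper levels.

```python
def make_ranges(window_size_glo, focal_level):
    """Hypothetical sketch: coordinate ranges for the pooled surround
    at a given focal level.

    Assumption (mine, not the repo's): the surround at level l is pooled
    with stride 2**l, so the pooled extent along each axis is
    window_size_glo // 2**l, and range_h / range_w should depend on l.
    """
    pooled = window_size_glo // (2 ** focal_level)  # pooled extent (assumed)
    range_h = list(range(pooled))  # pooled row coordinates
    range_w = list(range(pooled))  # pooled column coordinates
    return range_h, range_w


# Under this assumption the ranges differ per level, which is what the
# question expects but the cited lines do not do:
rh0, rw0 = make_ranges(8, 0)  # level 0: 8 positions per axis
rh1, rw1 = make_ranges(8, 1)  # level 1: 4 positions per axis
```

If the ranges really are meant to be identical at every level (e.g. because pooling is applied before the coordinates are computed), a clarifying comment at those lines would resolve the confusion.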