microsoft / Focal-Transformer

[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"

num_heads value #14

Closed kimjisoo12 closed 2 years ago

kimjisoo12 commented 2 years ago

Hi, sorry to bother you. I would like to know whether the num_heads values in the four stages of your code are all the same, and whether the num_heads value in each stage is fixed or tied to something else.

jwyang commented 2 years ago

Hi, the numbers of heads are specified the same as in Swin. For example, for Focal-Tiny, the numbers of heads at the four stages are 3-6-12-24, as shown here.
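
For context, here is a minimal sketch (not the repository's code) of the relationship behind those numbers, assuming Swin-Tiny's default stage-1 embedding dimension of 96 and the channel doubling after each stage used by both Swin and Focal Transformer: the head count doubles along with the embedding dimension, so the per-head dimension stays constant at 32.

```python
embed_dim = 96               # stage-1 embedding dimension (Swin-Tiny default; assumed)
num_heads = [3, 6, 12, 24]   # heads per stage, as in Focal-Tiny

for stage, heads in enumerate(num_heads, start=1):
    dim = embed_dim * 2 ** (stage - 1)  # channel dim doubles at each stage
    head_dim = dim // heads             # dimension per attention head
    print(f"stage {stage}: dim={dim}, heads={heads}, head_dim={head_dim}")
    # head_dim comes out to 32 at every stage
```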