SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022.
MIT License

PE added on query and key #37

Closed · XiaoyuShi97 closed this 2 years ago

XiaoyuShi97 commented 2 years ago

Hi. I see that the current version only supports PE as a bias added to the attention map. I wonder whether a future version could support adding PE to the query and key, which is another common form of PE. Thanks again for your work and your prompt replies!
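For anyone skimming this thread, a minimal sketch of the distinction being asked about (plain PyTorch, illustrative shapes and names only; none of this is taken from the NATTEN codebase):

```python
import torch

B, H, N, D = 2, 4, 16, 32          # batch, heads, tokens, head dim
q = torch.randn(B, H, N, D)
k = torch.randn(B, H, N, D)

# Style 1: PE as a bias added to the attention map (what NAT currently does).
attn = q @ k.transpose(-2, -1) * D ** -0.5   # (B, H, N, N) logits
rel_bias = torch.randn(H, N, N)              # learned relative position bias
attn = attn + rel_bias                       # bias the logits directly

# Style 2: PE added to queries and keys before the dot product
# (the alternative being requested here).
pos = torch.randn(N, D)                      # per-token positional embedding
attn2 = (q + pos) @ (k + pos).transpose(-2, -1) * D ** -0.5
```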

alihassanijr commented 2 years ago

Hello and thanks for the interest.

Could you possibly refer us to a paper so we can look into it further? Our current version follows Swin in applying relative positional biases to the attention weights, based on the relative positions of the queries and keys.
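A minimal sketch of how a Swin-style relative positional bias is typically gathered, assuming a 1-D window of size W for simplicity; all names here are illustrative, not NATTEN's actual implementation:

```python
import torch

W, H = 7, 4                                   # window size, number of heads
table = torch.randn(2 * W - 1, H)             # one learned bias per relative offset

coords = torch.arange(W)
rel = coords[None, :] - coords[:, None] + (W - 1)   # offsets shifted into [0, 2W-2]
bias = table[rel]                             # (W, W, H): bias for each query-key pair
bias = bias.permute(2, 0, 1)                  # (H, W, W), added to the attention logits
```

The key point is that the bias depends only on the relative offset between a query and a key, so the same small table is shared across all window positions.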

alihassanijr commented 2 years ago

Closing this due to inactivity. If you still have questions feel free to open it back up.