SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer (arXiv 2022, CVPR 2023). Dilated Neighborhood Attention Transformer (arXiv 2022).
MIT License

Tiny Bug in nattencuda.py #46

Closed · z-jiaming closed this issue 2 years ago

z-jiaming commented 2 years ago

Great Work!

Well, I found a small bug in nattencuda.py.

https://github.com/SHI-Labs/Neighborhood-Attention-Transformer/blob/a9a7580b112a20db48dbc0fc64d25bcd75c974d5/natten/nattencuda.py#L117

It should use `self.kernel_size` during padding when the feature size is smaller than the kernel size.
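
For reference, a minimal sketch of what the corrected padding step could look like. The helper name `pad_to_kernel_size` and the standalone NHWC layout here are illustrative, not the exact code at the linked line; the point is just that the pad amounts are computed against `kernel_size`:

```python
import torch
from torch.nn import functional as F

def pad_to_kernel_size(x: torch.Tensor, kernel_size: int) -> torch.Tensor:
    """Pad an NHWC feature map so that H and W are at least kernel_size.

    Neighborhood attention needs every query to see a full
    kernel_size x kernel_size neighborhood, so inputs smaller than the
    kernel must be padded up to it.
    """
    B, H, W, C = x.shape
    pad_b = max(0, kernel_size - H)  # extra rows needed at the bottom
    pad_r = max(0, kernel_size - W)  # extra columns needed on the right
    if pad_b > 0 or pad_r > 0:
        # F.pad pads the last dims first: (C_front, C_back, W_left, W_right, H_top, H_bottom)
        x = F.pad(x, (0, 0, 0, pad_r, 0, pad_b))
    return x

x = torch.randn(1, 5, 5, 32)    # 5x5 feature map, 32 channels
y = pad_to_kernel_size(x, 7)    # padded up to the 7x7 kernel
assert y.shape[1:3] == (7, 7)
```

In the module's forward pass, the output would then be cropped back to the original H × W after attention so the padding does not change the output shape.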

alihassanijr commented 2 years ago

Thank you for your interest, and for bringing this to our attention.