SHI-Labs / Neighborhood-Attention-Transformer
Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022.
MIT License · 1.04k stars · 85 forks
NA CUDA Extension v0.12! #44
Closed by alihassanijr 2 years ago

alihassanijr commented 2 years ago
Changes to the extension:
- Fixed the race condition in the K-Backward and V-Backward kernels.
- Added "tiled" Neighborhood Attention kernels.
- Improved FP16 support.
- Added new 1D NA kernels.
- Added window-size templating.
- Gradchecks now run in fast mode.

Other changes:
- Added a changelog.
- Updated the README.
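For readers unfamiliar with the access pattern the new 1D NA kernels implement: in 1D neighborhood attention, each query attends to a fixed-size window of keys centered on its own position, and near the sequence boundaries the window slides inward rather than shrinking. The following is a minimal pure-Python sketch of that pattern; the function names (`na1d_window`, `na1d`) and the scalar head dimension are illustrative only and are not the extension's actual API.

```python
import math

def na1d_window(i, length, window_size):
    # Each query i attends to `window_size` keys centered on i.
    # Near the edges the window is clamped so it stays in bounds
    # without shrinking. Assumes window_size <= length.
    radius = window_size // 2
    start = min(max(i - radius, 0), length - window_size)
    return list(range(start, start + window_size))

def na1d(queries, keys, values, window_size):
    # Toy scalar version (head_dim = 1): q, k, v are lists of floats.
    out = []
    for i, q in enumerate(queries):
        idx = na1d_window(i, len(keys), window_size)
        # Softmax over the neighborhood only, not the full sequence.
        scores = [q * keys[j] for j in idx]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        out.append(sum((e / z) * values[j] for e, j in zip(exps, idx)))
    return out
```

With uniform keys the neighborhood softmax degenerates to a local average, which makes the windowing easy to check by hand: `na1d([1.0, 1.0, 1.0], [1.0, 1.0, 1.0], [0.0, 3.0, 6.0], 3)` returns `[3.0, 3.0, 3.0]`.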