[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"
545 stars · 59 forks
Your computational complexity is much higher than Swin Transformer's #9
Open
zhouyi-git opened 2 years ago
I ran a segmentation test with Swin Transformer and compared it against your model. I found that yours takes about twice as long as Swin. Why is that?
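For reference, one way to make a runtime comparison like this reproducible is to time repeated forward passes with warm-up runs excluded and report the average. A minimal stdlib sketch (the `swin_forward` and `focal_forward` callables below are hypothetical stand-ins for the two models' actual segmentation forward passes, simulated here with sleeps):

```python
import time

def benchmark(fn, warmup=3, iters=10):
    """Time repeated calls to fn, returning mean seconds per call.

    Warm-up iterations are run first and excluded from the timing,
    so one-off costs (caching, allocation) do not skew the average.
    """
    for _ in range(warmup):
        fn()
    start = time.perf_counter()
    for _ in range(iters):
        fn()
    return (time.perf_counter() - start) / iters

# Hypothetical stand-ins: in a real comparison these would call
# model(input) for each architecture on identical inputs.
def swin_forward():
    time.sleep(0.001)

def focal_forward():
    time.sleep(0.002)

ratio = benchmark(focal_forward) / benchmark(swin_forward)
print(f"focal / swin runtime ratio: {ratio:.2f}")
```

When timing real models, the input size, batch size, and hardware should be held fixed across both runs, since attention cost grows with resolution and can change which model is faster.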