microsoft / Focal-Transformer

[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"

Your computational complexity is much higher than Swin Transformer #9

Open zhouyi-git opened 2 years ago

zhouyi-git commented 2 years ago

I ran a segmentation test with Swin Transformer and compared it against your model. I found that yours takes about twice as long as Swin. Why is that?
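For context, a minimal sketch of how such a wall-clock comparison might be run. The helper name `avg_latency` and the placeholder models `focal_model` / `swin_model` are hypothetical, not the benchmarking setup actually used here:

```python
import time
import torch

def avg_latency(model, input_shape=(1, 3, 224, 224), warmup=10, iters=50):
    """Average forward-pass time of a PyTorch model, in seconds."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device).eval()
    x = torch.randn(*input_shape, device=device)
    with torch.no_grad():
        for _ in range(warmup):          # warm-up: cuDNN autotuning, lazy init
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()     # flush queued GPU work before timing
        start = time.perf_counter()
        for _ in range(iters):
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

# Hypothetical usage, with both backbones built on identical inputs:
# print(avg_latency(focal_model), avg_latency(swin_model))
```

Running both models through the same harness, on the same device and input shape, rules out measurement artifacts before attributing the gap to the architecture itself.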

zhouyi-git commented 2 years ago

Can you give me some advice?
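For anyone trying to quantify the gap rather than rely on wall-clock time alone, counting FLOPs gives a device-independent comparison. Below is a minimal sketch using `fvcore`'s `FlopCountAnalysis`; the choice of tool is my assumption, not something the authors prescribe:

```python
import torch
from fvcore.nn import FlopCountAnalysis

def gflops(model, input_shape=(1, 3, 224, 224)):
    """Total GFLOPs for one forward pass, counted by fvcore."""
    model.eval()
    x = torch.randn(*input_shape)
    return FlopCountAnalysis(model, x).total() / 1e9

# Hypothetical usage, comparing the two backbones:
# print(gflops(focal_model), gflops(swin_model))
```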