cmsflash / efficient-attention

An implementation of the efficient attention module.
https://arxiv.org/abs/1812.01243
MIT License

dimension problem #11

Open feimadada opened 1 year ago

feimadada commented 1 year ago

The input dimensions are (1, 128, 32, 32) or (1, 256, 16, 16). I tried to use efficient attention, but the memory usage and GFLOPs are still large.
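For reference, here is a minimal single-head sketch of the mechanism from the paper (softmax-normalized queries and keys, with K^T V computed first so the cost is linear in the number of positions). The class and parameter names are hypothetical and do not match the repo's actual multi-head module:

```python
# Minimal sketch of efficient attention (arXiv:1812.01243), single head.
# Names (EfficientAttention, to_q, ...) are illustrative, not the repo's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EfficientAttention(nn.Module):
    def __init__(self, channels, key_channels, value_channels):
        super().__init__()
        self.to_q = nn.Conv2d(channels, key_channels, 1)
        self.to_k = nn.Conv2d(channels, key_channels, 1)
        self.to_v = nn.Conv2d(channels, value_channels, 1)
        self.proj = nn.Conv2d(value_channels, channels, 1)

    def forward(self, x):
        b, _, h, w = x.shape
        q = self.to_q(x).flatten(2)  # (b, dk, n), n = h * w
        k = self.to_k(x).flatten(2)  # (b, dk, n)
        v = self.to_v(x).flatten(2)  # (b, dv, n)
        # Normalize queries over channels and keys over positions,
        # then compute K^T V first: O(n * dk * dv) instead of O(n^2).
        q = F.softmax(q, dim=1)
        k = F.softmax(k, dim=2)
        context = k @ v.transpose(1, 2)      # (b, dk, dv)
        out = context.transpose(1, 2) @ q    # (b, dv, n)
        return self.proj(out.reshape(b, -1, h, w))
```

With n = 32 * 32 = 1024 positions, the two matrix products cost O(n * dk * dv), so the savings over dot-product attention (O(n^2 * dk)) only show up once n is clearly larger than the key/value channel counts; for small feature maps the two can be comparable.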

cmsflash commented 1 year ago

Could you provide more specific numbers (i.e. exact FLOPs and memory)?
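One way to collect those numbers is sketched below: peak CUDA memory from PyTorch's allocator, plus an analytic count of the attention matmul FLOPs for both variants. It assumes a CUDA device and reuses the hypothetical `EfficientAttention` module from the sketch above:

```python
# Hedged measurement sketch: peak memory via torch.cuda stats, FLOPs
# counted analytically for the attention matmuls only (projections excluded).
import torch

def peak_memory_mib(module, x):
    torch.cuda.reset_peak_memory_stats()
    with torch.no_grad():
        module(x)
    return torch.cuda.max_memory_allocated() / 2**20

def attention_matmul_flops(n, dk, dv, efficient):
    # Dot-product attention: (n, dk) @ (dk, n), then (n, n) @ (n, dv).
    # Efficient attention:   (dk, n) @ (n, dv), then (n, dk) @ (dk, dv).
    if efficient:
        return 4 * n * dk * dv
    return 2 * n * n * dk + 2 * n * n * dv

x = torch.randn(1, 128, 32, 32, device="cuda")
attn = EfficientAttention(128, 64, 128).cuda()  # hypothetical module above
n = 32 * 32
print(f"peak memory: {peak_memory_mib(attn, x):.1f} MiB")
print(f"efficient matmul GFLOPs:   {attention_matmul_flops(n, 64, 128, True) / 1e9:.4f}")
print(f"dot-product matmul GFLOPs: {attention_matmul_flops(n, 64, 128, False) / 1e9:.4f}")
```

Note that the 1x1 projection convolutions dominate the FLOP count at these resolutions and are identical in both variants, which may explain why the totals reported by a profiler look similar.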