sovrasov / flops-counter.pytorch

Flops counter for convolutional networks in pytorch framework
MIT License
2.83k stars 306 forks

Flops of Swin Transformer #142

Open AiHaiHai opened 3 months ago

AiHaiHai commented 3 months ago

The reported FLOPs of swin_t are 4.5G, but I get 3.13G here. Am I using it the wrong way?

import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.swin_t(num_classes=1000)
macs, params = get_model_complexity_info(net, (3, 224, 224), as_strings=True, print_per_layer_stat=False)
print('{:<30}  {:<8}'.format('Computational complexity: ', macs))
print('{:<30}  {:<8}'.format('Number of parameters: ', params))
Computational complexity:       3.13 GMac
Number of parameters:           28.29 M 


sovrasov commented 3 months ago

Transformer support is incomplete in the torch backend; to fix that, you can switch to the aten backend:

import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.swin_t(num_classes=1000)
macs, params = get_model_complexity_info(net, (3, 224, 224), as_strings=True, print_per_layer_stat=False, backend='aten')
print('{:<30}  {:<8}'.format('Computational complexity: ', macs))
print('{:<30}  {:<8}'.format('Number of parameters: ', params))
Computational complexity:       4.5 GMac
Number of parameters:           28.29 M 
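
The gap between the two backends comes from operations called functionally (e.g. torch.matmul inside attention), which the hook-based torch backend cannot see, while the aten backend traces every operator. As a rough illustration (attention_macs is a hypothetical helper, not part of ptflops), the MAC count of one self-attention block over a token sequence splits like this:

```python
# Hypothetical helper, NOT part of ptflops: back-of-the-envelope MACs for one
# multi-head self-attention block on seq_len tokens of width dim.
def attention_macs(seq_len: int, dim: int) -> int:
    qkv = 3 * seq_len * dim * dim       # Q, K, V projections (nn.Linear: hooks see these)
    scores = seq_len * seq_len * dim    # Q @ K^T       (torch.matmul: hooks miss this)
    weighted = seq_len * seq_len * dim  # attn @ V      (torch.matmul: hooks miss this)
    proj = seq_len * dim * dim          # output projection (nn.Linear: hooks see this)
    return qkv + scores + weighted + proj

# Example: a 7x7 Swin window (49 tokens) at embedding width 96.
# The two N^2*d matmul terms are exactly what a module-hook counter skips,
# which is the kind of undercount behind 3.13 GMac vs 4.5 GMac above.
print(attention_macs(49, 96))
```

The numbers here are only meant to show the shape of the undercount; the real per-stage widths and window counts of Swin-T differ.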
AiHaiHai commented 3 months ago

Transformer support is incomplete in the torch backend; to fix that, you can switch to the aten backend:

import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.swin_t(num_classes=1000)
macs, params = get_model_complexity_info(net, (3, 224, 224), as_strings=True, print_per_layer_stat=False, backend='aten')
print('{:<30}  {:<8}'.format('Computational complexity: ', macs))
print('{:<30}  {:<8}'.format('Number of parameters: ', params))
Computational complexity:       4.5 GMac
Number of parameters:           28.29 M 

Thanks a lot, it works now.