AiHaiHai opened this issue 3 months ago
Transformer support is not complete in the torch backend; to fix that, you can switch to aten:
import torchvision.models as models
from ptflops import get_model_complexity_info
net = models.swin_t(num_classes=1000)
macs, params = get_model_complexity_info(
    net, (3, 224, 224), as_strings=True,
    print_per_layer_stat=False, backend='aten')
print('{:<30} {:<8}'.format('Computational complexity: ', macs))
print('{:<30} {:<8}'.format('Number of parameters: ', params))
Computational complexity: 4.5 GMac
Number of parameters: 28.29 M
Thanks a lot, it works now.
The FLOPs of swin_t should be 4.5G, but I get 3.13G here. Am I using it the wrong way?
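For reference, here is a minimal sketch of where the 3.13G figure likely comes from, assuming a ptflops version that exposes both the default 'pytorch' backend and the 'aten' backend. The hook-based torch backend misses the attention matmuls in swin_t, while aten traces them, so running both side by side makes the gap visible:

import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.swin_t(num_classes=1000)

# Default hook-based backend: skips attention matmuls in transformers,
# so it undercounts swin_t (likely the ~3.13 GMac figure above).
macs_torch, _ = get_model_complexity_info(
    net, (3, 224, 224), as_strings=True,
    print_per_layer_stat=False, backend='pytorch')

# aten backend: counts transformer ops, matching the expected ~4.5 GMac.
macs_aten, _ = get_model_complexity_info(
    net, (3, 224, 224), as_strings=True,
    print_per_layer_stat=False, backend='aten')

print('pytorch backend:', macs_torch)
print('aten backend:   ', macs_aten)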