MrYxJ / calculate-flops.pytorch

calflops is designed to calculate FLOPs, MACs and Parameters for a wide range of neural networks, such as Linear, CNN, RNN, GCN, and Transformer models (Bert, LlaMA and other large language models).
https://pypi.org/project/calflops/
MIT License

Question about the percentages #27

Closed lzd-1230 closed 1 month ago

lzd-1230 commented 1 month ago

I tested resnet50 with the following code:

from calflops import calculate_flops
from torchvision import models

model = models.resnet50()
# Prints the report and returns (flops, macs, params)
flops, macs, params = calculate_flops(model, input_shape=(1, 3, 224, 224))

The first line of the output reads 25.56 M = 100% Params, 4.09 GMACs = 100% MACs, 8.21 GFLOPS = 49.8% FLOPs. Why is the FLOPs percentage for the whole model only 49.8%? Even measured against the fwd+bwd total, it should be a bit over 30%, shouldn't it? I don't understand how this value is obtained.

------------------------------------- Calculate Flops Results -------------------------------------
Notations:
number of parameters (Params), number of multiply-accumulate operations (MACs),
number of floating-point operations (FLOPs), floating-point operations per second (FLOPS),
fwd FLOPs (model forward propagation FLOPs), bwd FLOPs (model backward propagation FLOPs),
default model backpropagation takes 2.00 times as much computation as forward propagation.

Total Training Params:                                                  25.56 M 
fwd MACs:                                                               4.09 GMACs
fwd FLOPs:                                                              8.21 GFLOPS
fwd+bwd MACs:                                                           12.27 GMACs
fwd+bwd FLOPs:                                                          24.63 GFLOPS

-------------------------------- Detailed Calculated FLOPs Results --------------------------------
Each module calculated is listed after its name in the following order: 
params, percentage of total params, MACs, percentage of total MACs, FLOPs, percentage of total FLOPs

Note: 1. A module may use torch.nn.Module or torch.nn.functional calls to compute logits (e.g. CrossEntropyLoss). 
 These are not counted as submodules in calflops and are not printed out; however, they account for the difference between a parent module's MACs and the sum of its submodules'.
2. Number of floating-point operations is a theoretical estimation, thus FLOPS computed using that could be larger than the maximum system throughput.

ResNet(
  25.56 M = 100% Params, 4.09 GMACs = 100% MACs, 8.21 GFLOPS = 49.8% FLOPs

And if the per-layer numbers below are measured against the 8.21 GFLOPS total, they also come out to only about half.
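For reference, the report itself states that backpropagation is assumed to take 2.00 times the forward computation, so the expected fwd-to-total ratio can be checked with a few lines (the numbers below are copied from the report above):

```python
# Per the report's assumption: bwd = 2.00 * fwd, so fwd+bwd = 3 * fwd.
fwd_flops = 8.21                 # GFLOPs, fwd FLOPs from the report
bwd_flops = 2.0 * fwd_flops      # assumed backward cost
total = fwd_flops + bwd_flops    # fwd+bwd FLOPs

print(f"fwd+bwd = {total:.2f} GFLOPs")        # matches the report's 24.63
print(f"fwd share = {fwd_flops / total:.1%}")
```

So measured against the fwd+bwd total, the forward pass should be about 33.3%, and measured against the forward pass alone it should be 100%; neither interpretation yields 49.8%.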

MrYxJ commented 1 month ago

Hello, I have fixed the bug where the FLOPs percentage displays as half in the latest version of calflops; the 49.8% here should actually be 100%. See https://pypi.org/project/calflops/: the problem is fixed in versions >= 0.3.1. You can update with:

pip install --upgrade calflops

lzd-1230 commented 1 month ago

@MrYxJ Thanks for the quick reply! I'd also like to ask whether there is a way to export the information in a structured format, e.g. straight to csv or xlsx. I've noticed the printed output varies slightly between models, so extracting the figures from the strings with regular expressions is cumbersome. Is there a convenient way to collect the per-layer FLOPs into a structured table?