Besides convolutions and FCs, ptflops also takes into account BNs and activations. Maybe the large feature maps of ResNets, and consequently the large activation costs, trigger the difference you mentioned. You can check this yourself with the sketch below.
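A quick way to see how much BN and the activations contribute is to count twice, once with those modules excluded. A minimal sketch, assuming torchvision and a ptflops version that supports the `ignore_modules` argument (the keyword may differ in older releases):

```python
import torch.nn as nn
import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.resnet50()

# Full count: convolutions, FCs, BNs and activations all included.
macs_full, _ = get_model_complexity_info(
    net, (3, 224, 224), as_strings=False, print_per_layer_stat=False)

# Same model with BN and ReLU hooks skipped, isolating their share.
macs_conv, _ = get_model_complexity_info(
    net, (3, 224, 224), as_strings=False, print_per_layer_stat=False,
    ignore_modules=[nn.BatchNorm2d, nn.ReLU])

print(f"full: {macs_full / 1e9:.2f} GMac, without BN/ReLU: {macs_conv / 1e9:.2f} GMac")
```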
Hey @sovrasov, there's one thing though: your package returns MACs (G), and MACs (G) = 0.5 * FLOPs (G) [as you stated in another thread]. So if I take the resnet34 example, the FLOPs (G) in that case would be 1.84, which is not what's mentioned in their paper. Could you help me out with this?
@lucifermorningstar1305 see https://github.com/sovrasov/flops-counter.pytorch/issues/16#issuecomment-802631732
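In case the conversion itself is the sticking point: under the 1 MAC = 2 FLOPs convention you multiply the reported GMac value by two, not halve it. A toy example (the 3.67 GMac figure is illustrative, not a measured value):

```python
macs_g = 3.67          # hypothetical GMac value reported by ptflops
flops_g = 2 * macs_g   # 1 MAC = 2 FLOPs -> 7.34 GFLOPs, not 1.84
```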
Hey @sovrasov, thanks a lot!!
Hi, thanks for your great repo!
It seems like the calculated FLOPs for ResNet50 (4.12x10^9) do not match the 3.8x10^9 reported in the paper, and the ResNet101 and ResNet152 results are also slightly different from the paper's (I used the torchvision models).
May I ask why?
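For reference, a minimal sketch of how such a figure is typically obtained with ptflops (usage per the ptflops README; the 224x224 input resolution is an assumption):

```python
import torchvision.models as models
from ptflops import get_model_complexity_info

# Count the complexity of the stock torchvision ResNet-50 at 224x224 input.
net = models.resnet50()
macs, params = get_model_complexity_info(
    net, (3, 224, 224), as_strings=True, print_per_layer_stat=False)
print(macs, params)  # prints the complexity figure quoted above (~4.12 x 10^9)
```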
Thank you very much!