sovrasov / flops-counter.pytorch

Flops counter for convolutional networks in pytorch framework
MIT License

Disable patch_tensor_ops for modules with custom_modules_hooks #139

Closed by Rikorose 4 months ago

Rikorose commented 5 months ago

I have some custom layers for which I implemented the flop counting manually via custom_modules_hooks. This produces nice output from print_per_layer_stat, so I know the flops and params of the corresponding layers within a larger model.
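For context, a minimal sketch of the setup being described, assuming the usual ptflops hook convention of accumulating into the module's `__flops__` counter (the layer and its cost model here are made up for illustration):

```python
import torch
import torch.nn as nn
from ptflops import get_model_complexity_info

class MyCustomLayer(nn.Module):
    """Hypothetical custom layer standing in for the reporter's modules."""
    def forward(self, x):
        return x * torch.sigmoid(x)

def my_custom_layer_hook(module, input, output):
    # Assumed cost model: two elementwise ops per output element.
    # ptflops forward hooks accumulate into module.__flops__.
    module.__flops__ += 2 * output.numel()

flops, params = get_model_complexity_info(
    MyCustomLayer(), (16, 128),
    custom_modules_hooks={MyCustomLayer: my_custom_layer_hook},
    print_per_layer_stat=True)
```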

However, some ops are also counted by patch_tensor_ops, which results in the final total being twice as large as what is printed in the per-layer stats:

*(screenshot omitted)*

Ideally, patch_tensor_ops would not be applied to modules that have a custom hook.

sovrasov commented 5 months ago

Thanks for reporting this. patch_tensor_ops was an attempt to add support for transformers. Now that ptflops has an aten backend to handle transformers, I'll consider disabling patch_tensor_ops.
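For reference, counting a transformer with the aten backend might look roughly like this (the model and shapes are illustrative, and FLOPS_BACKEND.ATEN is assumed to be the enum value selecting that backend):

```python
import torch.nn as nn
from ptflops import get_model_complexity_info, FLOPS_BACKEND

# Stand-in transformer model for illustration.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=8, batch_first=True),
    num_layers=4)

# Count FLOPs by tracing dispatched aten ops instead of module hooks.
flops, params = get_model_complexity_info(
    encoder, (64, 256), backend=FLOPS_BACKEND.ATEN)
```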

sovrasov commented 4 months ago

After #140 you can pass backend_specific_config={'count_functional': False} to disable counting functionals, which works around your problem. Also, to use the torch backend you now need to pass backend=FLOPS_BACKEND.PYTORCH, since the default backend is now aten.
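Put together, the workaround call would look something like this (a sketch based on the comment above; the model is a stand-in, and FLOPS_BACKEND is assumed to be importable from the ptflops package root):

```python
import torchvision.models as models
from ptflops import get_model_complexity_info, FLOPS_BACKEND

net = models.resnet18()  # stand-in model for illustration

# Select the torch backend explicitly (the default is now aten) and
# disable counting of functional ops so custom hooks are not doubled.
flops, params = get_model_complexity_info(
    net, (3, 224, 224),
    backend=FLOPS_BACKEND.PYTORCH,
    backend_specific_config={'count_functional': False})
```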