[Closed] lelelexxx closed this issue 4 years ago
We use thop (https://github.com/Lyken17/pytorch-OpCounter) for flops counting.
I used pytorch-OpCounter to calculate the FLOPs of PSMNet and got 336.7G FLOPs and 5.225M params. However, according to Table 2 in your paper, PSMNet has 613.9G FLOPs and 5.22M params, which is very different from our result. Did I miss something that led to the wrong result?
The FLOPs depend on the input resolution; we use a 576x960 input, and FLOPs are measured for conv layers only.
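For reference, conv-layer FLOPs at a given resolution can be counted by hand. The sketch below is a minimal illustration of that formula, not PSMNet's actual layer configuration; the example layer shape (3 -> 32 channels, 3x3 kernel) is an assumption. Also note that some counters, including thop, report multiply-accumulates (MACs) rather than 2x-FLOPs, so a factor-of-two mismatch between tools is common.

```python
def conv2d_flops(c_in, c_out, k_h, k_w, h_out, w_out, macs_only=False):
    """FLOPs of one 2D conv layer: each output element needs
    c_in * k_h * k_w multiply-accumulates; one MAC = 2 FLOPs."""
    macs = c_in * k_h * k_w * c_out * h_out * w_out
    return macs if macs_only else 2 * macs

# Illustrative layer: 3x3 conv, 3 -> 32 channels, stride 1,
# 'same' padding, on the 576x960 input used in the paper.
macs = conv2d_flops(3, 32, 3, 3, 576, 960, macs_only=True)
flops = conv2d_flops(3, 32, 3, 3, 576, 960)
```

Summing this over every conv layer (2D, 3D, and transposed) gives the conv-only total; whether you report MACs or 2x-FLOPs should be stated alongside the number.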
Got it! Thanks for your reply.
> The FLOPs depend on the input resolution; we use a 576x960 input, and FLOPs are measured for conv layers only.
I set a 1x3x576x960 input and counted only conv layers (2D conv, 3D conv, and transposed conv), and got 1.08T FLOPs for PSMNet using pytorch-OpCounter. Is there anything wrong with my setting? Could you please provide the detailed FLOPs information, for example the custom ops? Any reply would be appreciated. Thanks!
Hi! Thanks for sharing such excellent work. I am curious about the FLOPs of the different networks. Which libs or tools did you use for the FLOPs calculation? Is the FLOPs calculation code accessible in this repo?