lyuwenyu / RT-DETR

[CVPR 2024] Official RT-DETR (RTDETR paddle pytorch), Real-Time DEtection TRansformer, DETRs Beat YOLOs on Real-time Object Detection. 🔥 🔥 🔥
Apache License 2.0

How to calculate FLOPs in pytorch code? #240

Closed BenjaminJonghyun closed 7 months ago

BenjaminJonghyun commented 7 months ago

Hello, I tried to calculate FLOPs in PyTorch with [flops-counter.pytorch](https://github.com/sovrasov/flops-counter.pytorch):

```python
from ptflops import get_model_complexity_info

with torch.cuda.device(0):
    net = model
    macs, params = get_model_complexity_info(net, (3, 640, 640), as_strings=True,
                                             print_per_layer_stat=True, verbose=True)
    print('{:<30}  {:<8}'.format('Computational complexity: ', macs))
    print('{:<30}  {:<8}'.format('Number of parameters: ', params))
```

But I got the warnings below when I inserted the above code at L114 of src/solver/det_engine.py. Could you help me solve this issue?

```
Warning: module ConvNormLayer is treated as a zero-op.
Warning: module Identity is treated as a zero-op.
Warning: module BasicBlock is treated as a zero-op.
Warning: module Blocks is treated as a zero-op.
Warning: module PResNet is treated as a zero-op.
Warning: module NonDynamicallyQuantizableLinear is treated as a zero-op.
Warning: module Dropout is treated as a zero-op.
Warning: module MSDeformableAttention is treated as a zero-op.
Warning: module TransformerDecoderLayer is treated as a zero-op.
Warning: module TransformerDecoder is treated as a zero-op.
Warning: module Embedding is treated as a zero-op.
Warning: module MLP is treated as a zero-op.
Warning: module RTDETRTransformer is treated as a zero-op.
Warning: module TransformerEncoderLayer is treated as a zero-op.
Warning: module TransformerEncoder is treated as a zero-op.
Warning: module SiLU is treated as a zero-op.
Warning: module ConvNormLayer is treated as a zero-op.
Warning: module RepVggBlock is treated as a zero-op.
Warning: module CSPRepLayer is treated as a zero-op.
Warning: module HybridEncoder is treated as a zero-op.
Warning: module RTDETR is treated as a zero-op.
Warning: module RTDETRPostProcessor is treated as a zero-op.
Warning: module Model is treated as a zero-op.
```

lyuwenyu commented 7 months ago

Sorry, I haven't used https://github.com/sovrasov/flops-counter.pytorch before.


You can use the PyTorch profiler to get FLOPs: https://pytorch.org/docs/stable/profiler.html#module-torch.profiler
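For reference, a minimal sketch of this approach using `torch.profiler` with `with_flops=True`. The small `nn.Sequential` model and input shape here are stand-ins, not from the repo; substitute the loaded RT-DETR model and its preprocessed input:

```python
import torch
import torch.nn as nn
from torch.profiler import profile, ProfilerActivity

# Stand-in model for illustration; replace with the loaded RT-DETR model.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1),
)
model.eval()
x = torch.randn(1, 3, 640, 640)

# with_flops=True asks the profiler to estimate FLOPs for supported ops
# (e.g. convolutions and matrix multiplies); unsupported ops report 0.
with profile(activities=[ProfilerActivity.CPU], with_flops=True) as prof:
    with torch.no_grad():
        model(x)

# Sum the FLOP estimates over all profiled events.
total_flops = sum(e.flops for e in prof.key_averages())
print(f"GFLOPs: {total_flops / 1e9:.2f}")
```

Note that the profiler only counts operators it has FLOP formulas for, so custom kernels (e.g. deformable attention) may still be undercounted; the printed total is a lower bound in that case.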