Hello, I tried to calculate FLOPs in pytorch code with this repo (https://github.com/sovrasov/flops-counter.pytorch):

```python
import torch
from ptflops import get_model_complexity_info

with torch.cuda.device(0):
    net = model
    macs, params = get_model_complexity_info(net, (3, 640, 640), as_strings=True,
                                             print_per_layer_stat=True, verbose=True)
    print('{:<30} {:<8}'.format('Computational complexity: ', macs))
    print('{:<30} {:<8}'.format('Number of parameters: ', params))
```
But I got the warnings below when I put the above code at L114 of src/solver/det_engine.py. Could you help to solve this issue?

```
Warning: module ConvNormLayer is treated as a zero-op.
Warning: module Identity is treated as a zero-op.
Warning: module BasicBlock is treated as a zero-op.
Warning: module Blocks is treated as a zero-op.
Warning: module PResNet is treated as a zero-op.
Warning: module NonDynamicallyQuantizableLinear is treated as a zero-op.
Warning: module Dropout is treated as a zero-op.
Warning: module MSDeformableAttention is treated as a zero-op.
Warning: module TransformerDecoderLayer is treated as a zero-op.
Warning: module TransformerDecoder is treated as a zero-op.
Warning: module Embedding is treated as a zero-op.
Warning: module MLP is treated as a zero-op.
Warning: module RTDETRTransformer is treated as a zero-op.
Warning: module TransformerEncoderLayer is treated as a zero-op.
Warning: module TransformerEncoder is treated as a zero-op.
Warning: module SiLU is treated as a zero-op.
Warning: module ConvNormLayer is treated as a zero-op.
Warning: module RepVggBlock is treated as a zero-op.
Warning: module CSPRepLayer is treated as a zero-op.
Warning: module HybridEncoder is treated as a zero-op.
Warning: module RTDETR is treated as a zero-op.
Warning: module RTDETRPostProcessor is treated as a zero-op.
Warning: module Model is treated as a zero-op.
```
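Most of these look like container modules whose leaf layers should still be counted, but I worry that a leaf module with real compute, such as MSDeformableAttention, really is being skipped. If I understand ptflops correctly, its `custom_modules_hooks` argument lets me register my own counter hook for such a module. Below is a minimal sketch of what I would try — the import path for `MSDeformableAttention` and the FLOP estimate inside the hook are my own assumptions/placeholders, not an exact count:

```python
import torch
from ptflops import get_model_complexity_info
# Import path assumed; adjust to wherever MSDeformableAttention is defined in this repo.
from src.zoo.rtdetr.rtdetr_decoder import MSDeformableAttention


def msda_flops_counter_hook(module, input, output):
    # Placeholder estimate only: roughly count the per-query projections,
    # just so the module is no longer treated as a zero-op.
    queries = input[0]                      # [batch, num_queries, embed_dim]
    num_queries = queries.shape[1]
    embed_dim = queries.shape[-1]
    module.__flops__ += int(2 * num_queries * embed_dim * embed_dim)


with torch.cuda.device(0):
    macs, params = get_model_complexity_info(
        model, (3, 640, 640), as_strings=True,
        print_per_layer_stat=True, verbose=True,
        custom_modules_hooks={MSDeformableAttention: msda_flops_counter_hook},
    )
```

I am not sure this is the intended way to handle these warnings, so any guidance would be appreciated.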