open-mmlab / mmdetection

OpenMMLab Detection Toolbox and Benchmark
https://mmdetection.readthedocs.io
Apache License 2.0

Model architecture summary - torchinfo/torchsummary #10252

Open fkroeber opened 1 year ago

fkroeber commented 1 year ago

I'd like to know if there is a way to get a model summary (preferably via torchinfo/torch-summary). Here is what I tried so far to get a summary for Mask R-CNN:

```python
import torch
from torchinfo import summary

summary(model, input_data=torch.tensor(test_img_in, dtype=torch.float))
```

This throws an error in the DenseHead, as displayed below. Any help is appreciated! The issue was already brought up here but doesn't seem to be solved yet.

I'm aware of the tools/analysis_tools/get_flops.py script, which gives a brief summary of the total number of parameters, etc. However, I'm looking for more in-depth insight into a model's structure (two partial, untested workaround sketches follow after the traceback below).


TypeError                                 Traceback (most recent call last)
File c:\Users\felix.virtualenvs\field_boundaries\lib\site-packages\torchinfo\torchinfo.py:288, in forward_pass(model, x, batch_dim, cache_forward_pass, device, mode, **kwargs)
    287 if isinstance(x, (list, tuple)):
--> 288     _ = model.to(device)(*x, **kwargs)
    289 elif isinstance(x, dict):

File c:\Users\felix.virtualenvs\field_boundaries\lib\site-packages\torch\nn\modules\module.py:1538, in Module._call_impl(self, *args, **kwargs)
   1536     args = bw_hook.setup_input_hook(args)
-> 1538 result = forward_call(*args, **kwargs)
   1539 if _global_forward_hooks or self._forward_hooks:

File c:\users\felix\repositories\mmdetection\mmdet\models\detectors\base.py:96, in BaseDetector.forward(self, inputs, data_samples, mode)
     95 elif mode == 'tensor':
---> 96     return self._forward(inputs, data_samples)
     97 else:

File c:\users\felix\repositories\mmdetection\mmdet\models\detectors\two_stage.py:134, in TwoStageDetector._forward(self, batch_inputs, batch_data_samples)
    133 if self.with_rpn:
--> 134     rpn_results_list = self.rpn_head.predict(
    135         x, batch_data_samples, rescale=False)
    136 else:

File c:\users\felix\repositories\mmdetection\mmdet\models\dense_heads\base_dense_head.py:191, in BaseDenseHead.predict(self, x, batch_data_samples, rescale)
    175 """Perform forward propagation of the detection head and predict
    176 detection results on the features of the upstream network.
    (...)
    189 after the post process.
    190 """
--> 191 batch_img_metas = [
    192     data_samples.metainfo for data_samples in batch_data_samples
    193 ]
    195 outs = self(x)

TypeError: 'NoneType' object is not iterable

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
Cell In[31], line 2
      1 from torchinfo import summary
----> 2 summary(model, input_data=torch.tensor(test_img_in, dtype=torch.float))

File c:\Users\felix.virtualenvs\field_boundaries\lib\site-packages\torchinfo\torchinfo.py:218, in summary(model, input_size, input_data, batch_dim, cache_forward_pass, col_names, col_width, depth, device, dtypes, mode, row_settings, verbose, **kwargs)
    211 validate_user_params(
    212     input_data, input_size, columns, col_width, device, dtypes, verbose
    213 )
    215 x, correct_input_size = process_input(
    216     input_data, input_size, batch_dim, device, dtypes
    217 )
--> 218 summary_list = forward_pass(
    219     model, x, batch_dim, cache_forward_pass, device, model_mode, **kwargs
    220 )
    221 formatting = FormattingOptions(depth, verbose, columns, col_width, rows)
    222 results = ModelStatistics(
    223     summary_list, correct_input_size, get_total_memory_used(x), formatting
    224 )

File c:\Users\felix.virtualenvs\field_boundaries\lib\site-packages\torchinfo\torchinfo.py:297, in forward_pass(model, x, batch_dim, cache_forward_pass, device, mode, **kwargs)
    295 except Exception as e:
    296     executed_layers = [layer for layer in summary_list if layer.executed]
--> 297     raise RuntimeError(
    298         "Failed to run torchinfo. See above stack traces for more details. "
    299         f"Executed layers up to: {executed_layers}"
    300     ) from e
    301 finally:
    302     if hooks:

RuntimeError: Failed to run torchinfo. See above stack traces for more details. Executed layers up to: [ResNet: 1, Conv2d: 2, BatchNorm2d: 2, ReLU: 2, MaxPool2d: 2, ResLayer: 2, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, Sequential: 4, Conv2d: 5, BatchNorm2d: 5, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, ResLayer: 2, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, Sequential: 4, Conv2d: 5, BatchNorm2d: 5, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, ResLayer: 2, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, Sequential: 4, Conv2d: 5, BatchNorm2d: 5, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, ResLayer: 2, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, Sequential: 4, Conv2d: 5, BatchNorm2d: 5, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Bottleneck: 3, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, Conv2d: 4, BatchNorm2d: 4, ReLU: 4, FPN: 1, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4, ConvModule: 3, Conv2d: 4]
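
Since the failure comes from `batch_data_samples` being `None` inside `BaseDenseHead.predict`, one possible workaround is to let torchinfo forward a dummy `DetDataSample` to the detector. This is only an untested sketch: it assumes mmdet 3.x (where `DetDataSample` lives in `mmdet.structures`), that torchinfo passes extra keyword arguments through to the model's forward call (which the traceback above suggests), and an 800x800 input; the exact metainfo keys required may differ per model, and the RoI head may still fail further down.

```python
# Untested sketch: supply dummy data_samples so that BaseDenseHead.predict()
# has metainfo to iterate over. `model` is the Mask R-CNN detector built
# earlier; the metainfo keys below are an assumption and may need adjusting.
import torch
from torchinfo import summary
from mmdet.structures import DetDataSample

h, w = 800, 800  # assumed input resolution
dummy_sample = DetDataSample()
dummy_sample.set_metainfo(dict(
    img_shape=(h, w),
    ori_shape=(h, w),
    pad_shape=(h, w),
    scale_factor=(1.0, 1.0),
))

# torchinfo forwards unknown keyword arguments to model.forward(), so
# data_samples reaches the detector as batch_data_samples.
summary(
    model,
    input_data=torch.randn(1, 3, h, w),
    data_samples=[dummy_sample],
)
```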
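
If the goal is mainly a layer-by-layer view of the architecture rather than a full end-to-end pass, a simpler (also untested) sketch is to summarize the backbone and neck in isolation, since they only consume plain tensors; `print(model)` additionally dumps the full module tree without running a forward pass.

```python
# Untested sketch: summarize sub-modules that take plain tensors, avoiding
# the data_samples machinery entirely. `model` is the detector from above;
# the 800x800 input size is an assumption.
import torch
from torchinfo import summary

x = torch.randn(1, 3, 800, 800)

# The backbone (e.g. ResNet) takes a plain image tensor.
summary(model.backbone, input_data=x, device="cpu", depth=4)

# The neck (e.g. FPN) consumes the backbone's multi-scale feature maps.
with torch.no_grad():
    feats = model.backbone(x)
summary(model.neck, input_data=[feats], device="cpu")

# Static view of the whole detector, no forward pass needed:
print(model)
```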

kvnptl commented 9 months ago

I have the same question. Is any solution available?