frgfm / torch-scan

Seamless analysis of your PyTorch models (RAM usage, FLOPs, MACs, receptive field, etc.)
https://frgfm.github.io/torch-scan/latest
Apache License 2.0

multiple outputs not supported? #80

Open muzafferkal opened 1 year ago

muzafferkal commented 1 year ago

Bug description

If a network returns multiple outputs, the forward hook at crawler.py:181 crashes because the out parameter is a tuple rather than a tensor, so it has no .size() method. Are modules returning multiple outputs supported and am I doing something wrong, or is this a bug? Thanks.

Code snippet to reproduce the bug

out_h, out_o = model(x_crnt, x_prev, init_h, init_o)
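
My actual model isn't easy to share, but here is a minimal sketch (the module and layer names are made up, not taken from my code) that I would expect to trigger the same crash inside the forward hook:

from torch import nn
from torchscan import summary

class TwoHeadNet(nn.Module):
    # toy model with two outputs, only meant to reproduce the crash
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(8, 16)
        self.head_a = nn.Linear(16, 4)
        self.head_b = nn.Linear(16, 2)

    def forward(self, x):
        feats = self.backbone(x)
        # the module output is a tuple of two tensors
        return self.head_a(feats), self.head_b(feats)

# the forward hook receives a tuple as the module output here
summary(TwoHeadNet(), (8,))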

Error traceback

  File "/local/home/...../lib64/python3.7/site-packages/torchscan/crawler.py", line 189, in _fwd_hook
    info[fw_idx]["output_shape"] = (-1, *out.shape[1:])
AttributeError: 'tuple' object has no attribute 'shape'
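
As a stopgap I patched the hook locally so it reads the shape of the first output when the module returns a sequence. This is only a sketch of what I changed around that line in crawler.py, not a proposal for the proper fix (the remaining outputs are simply ignored):

    # inside _fwd_hook: fall back to the first element when the output is a tuple/list
    out_tensor = out[0] if isinstance(out, (tuple, list)) else out
    info[fw_idx]["output_shape"] = (-1, *out_tensor.shape[1:])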

Environment

Collecting environment information...
TorchScan version: 0.1.2
PyTorch version: 1.8.1+cu102

OS: Amazon Linux 2

Python version: 3.7.16
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration: 
GPU 0: Tesla V100-SXM2-16GB
GPU 1: Tesla V100-SXM2-16GB
GPU 2: Tesla V100-SXM2-16GB
GPU 3: Tesla V100-SXM2-16GB

Nvidia driver version: 525.60.13
cuDNN version: Could not collect

frgfm commented 8 months ago

Hey there @muzafferkal :wave:

My apologies for the very late reply! Could you share a minimal reproducible snippet? I understand that your issue is with a model whose custom PyTorch modules return multiple outputs. But additional context and a snippet would help to come up with the best solution!

Cheers :v: