Hi,
If the model has zero parameters, then the call on line 102:
total_params_size = abs(total_params.numpy() * 4. / (1024 ** 2.))
will fail because `total_params` is then a plain integer, not a torch tensor. This may seem like an odd case (when would a model have no parameters?), but I ran into it after writing 'wrapper' Modules around modules that do have parameters; the wrappers themselves had no parameters.
The fix is simply to check that the parameter count is non-zero before calling `numpy()`, or to drop the `numpy()` call altogether.
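A minimal sketch of the guard, assuming `total_params` is either a torch tensor (the usual case) or a plain Python int (the zero-parameter case); `params_size_mb` is a hypothetical helper name, not part of the library:

```python
def params_size_mb(total_params):
    # total_params is normally a torch tensor, but falls back to a plain
    # int when the model has no parameters. Only call .numpy() when the
    # method actually exists, so the zero-parameter case does not raise.
    value = total_params.numpy() if hasattr(total_params, "numpy") else total_params
    # Same size formula as line 102: 4 bytes per parameter, reported in MB.
    return abs(value * 4. / (1024 ** 2.))


# Zero-parameter case: a plain int works without error.
print(params_size_mb(0))  # → 0.0
```

A tensor-valued `total_params` follows the `hasattr` branch and behaves exactly as the original line 102 did.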