Closed: Gregwar closed this issue 2 years ago
Describe the bug
If a model uses both a bare nn.Parameter and submodules, the nn.Parameter does not show up in the summary. However, if no submodule is present, the nn.Parameter appears.
To Reproduce
Example 1:
import torch as th             # assumed alias; the original snippet omits the imports
from torchinfo import summary  # assumed: the output below comes from summary(Net())

class Net(th.nn.Module):
    def __init__(self):
        super().__init__()
        self.w = th.nn.Parameter(th.zeros(1024))

summary(Net())
"""
Outputs:
=================================================================
Layer (type:depth-idx)                                   Param #
=================================================================
Net                                                        1,024
=================================================================
Total params: 1,024
Trainable params: 1,024
Non-trainable params: 0
=================================================================
"""
Example 2:
class Net(th.nn.Module):
    def __init__(self):
        super().__init__()
        self.w = th.nn.Parameter(th.zeros(1024))
        self.conv = th.nn.Conv2d(3, 6, 3)

summary(Net())  # same assumed setup as in Example 1
"""
Outputs:
=================================================================
Layer (type:depth-idx)                                   Param #
=================================================================
Net                                                           --
├─Conv2d: 1-1                                                168
=================================================================
Total params: 168
Trainable params: 168
Non-trainable params: 0
=================================================================
"""
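For reference, counting the parameters directly in PyTorch shows that the Example 2 model really holds 1,192 trainable parameters, so the 168 reported above understates it. A minimal check, assuming the same th alias as in the snippets above:

# Count parameters directly with PyTorch to show what the summary above drops.
import torch as th

class Net(th.nn.Module):
    def __init__(self):
        super().__init__()
        self.w = th.nn.Parameter(th.zeros(1024))  # bare parameter
        self.conv = th.nn.Conv2d(3, 6, 3)         # submodule

# Conv2d(3, 6, 3): 3*6*3*3 weights + 6 biases = 168, plus the 1,024-element w.
print(sum(p.numel() for p in Net().parameters()))  # 1192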
Expected behavior
I'd expect the 1,024 to appear in the Param # column next to Net in the second case as well.
Desktop (please complete the following information):
torchinfo version: 1.6.6
This has been fixed in d6e5dfa and will be released in torchinfo v1.7.0.
Thank you for reporting this issue!
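Once v1.7.0 is out, a quick way to confirm the fix could look like the sketch below. This is not the project's own test; it assumes summary() keeps returning a statistics object with a total_params field, as it does in current releases.

# Sketch: verify that the bare parameter is counted after upgrading torchinfo.
import torch as th
from torchinfo import summary

class Net(th.nn.Module):
    def __init__(self):
        super().__init__()
        self.w = th.nn.Parameter(th.zeros(1024))
        self.conv = th.nn.Conv2d(3, 6, 3)

stats = summary(Net(), verbose=0)        # verbose=0 suppresses printing
assert stats.total_params == 1024 + 168  # 1,192 expected once the fix is released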