[Closed] Animatory closed this issue 4 years ago
Because Conv2dARD creates its bias parameter before initializing the parent class (torch.nn.Conv2d):

    if bias:
        bias = Parameter(torch.Tensor(out_channels))
    else:
        bias = None
    super(Conv2dARD, self).__init__(in_channels, out_channels, kernel_size,
                                    stride, padding, dilation, groups, bias)

a torch.Tensor (rather than a bool) is passed into super().__init__(), so these lines in torch/nn/modules/conv.py fail:

    42    if bias:
    43        self.bias = Parameter(torch.Tensor(out_channels))

with:

    RuntimeError: bool value of Tensor with more than one value is ambiguous
Python 3.7.4, PyTorch 1.2
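For reference, a minimal sketch of one way to avoid this crash (not the library's actual code): pass the bool `bias` flag straight through to `nn.Conv2d` and let the parent create the `bias` parameter itself, instead of replacing the flag with a `Parameter` before calling `super().__init__()`.

```python
import torch
from torch import nn


class Conv2dARD(nn.Conv2d):
    """Sketch of an init order that avoids the bias crash (assumed layout)."""

    def __init__(self, in_channels, out_channels, kernel_size, stride=1,
                 padding=0, dilation=1, groups=1, bias=True):
        # Pass the *bool* flag through unchanged; nn.Conv2d creates
        # self.bias itself, so the `if bias:` check inside
        # torch/nn/modules/conv.py sees a bool, not a Tensor.
        super().__init__(in_channels, out_channels, kernel_size, stride,
                         padding, dilation, groups, bias)


layer = Conv2dARD(3, 8, kernel_size=3)
print(layer.bias.shape)  # torch.Size([8]), created by the parent class
```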
Hi @Animatory, same problem here. Have you fixed it?
Fixed! Conv2dARD.bias has been removed, since it immediately crashed convergence. Use batch normalization instead.
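A small illustrative sketch of that workaround (the layer sizes here are made up): create the convolution without a bias and let the learnable shift in BatchNorm2d take its place.

```python
import torch
from torch import nn

# Conv bias disabled; BatchNorm2d's learnable shift (beta) fills that role.
block = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

x = torch.randn(2, 3, 16, 16)
y = block(x)
print(y.shape)  # torch.Size([2, 8, 16, 16])
```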