Open duhd1993 opened 5 years ago
A simple example runs with no issues. But when I add a `BatchNorm1d` between layers, I get:

> Expected more than 1 value per channel when training, got input size torch.Size([1, 100])

It seems like it's treating `BatchNorm1d` as `BatchNorm2d`?
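For what it's worth, this isn't a 1d-vs-2d mix-up: `BatchNorm1d` computes mean and variance across the batch dimension, and with a batch of size 1 (input `[1, 100]`) there is only one value per channel, so the variance is undefined in training mode. A minimal sketch of the failure and the usual workarounds (the model and shapes below are assumed, not the reporter's exact code):

```python
import torch
import torch.nn as nn

# Hypothetical minimal model: Linear followed by BatchNorm1d over 100 features.
model = nn.Sequential(nn.Linear(10, 100), nn.BatchNorm1d(100))

x = torch.randn(1, 10)  # batch size 1 triggers the error in training mode

model.train()
try:
    model(x)
except ValueError as e:
    # "Expected more than 1 value per channel when training, ..."
    print("train() failed:", e)

# Workarounds: use a batch size >= 2, set drop_last=True on the DataLoader
# so no size-1 batch reaches the model, or call eval() so BatchNorm uses
# its running statistics instead of batch statistics.
model.eval()
out = model(x)  # works in eval mode
print(out.shape)  # torch.Size([1, 100])
```

With a batch size of 2 or more, the same `train()` forward pass succeeds, since per-channel statistics can then be computed.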