Closed Vegetabhl closed 3 years ago
You can use BatchNorm1d there
First of all, thank you for your reply. I tried it and still couldn't get it to work. Because my input shape is [batch, channel, h, w], I can only use BatchNorm2d, but I still get the same error with BatchNorm2d. I found that the error occurs after the F.adaptive_avg_pool2d() call: after that function, the tensor shape becomes [1, 64, 1, 1], and I learned that the BN layer cannot accept tensors of this shape. But my batch size is not set to 1, so I am confused now 😣
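A minimal sketch of the shape change described above (the 13×13 input feature map is a hypothetical size chosen only for illustration):

```python
import torch
import torch.nn.functional as F

# hypothetical [batch, channel, h, w] feature map entering the attention module
x = torch.randn(1, 64, 13, 13)

# adaptive average pooling to output size 1 collapses the spatial dimensions,
# leaving exactly one value per channel per sample
pooled = F.adaptive_avg_pool2d(x, 1)
print(pooled.shape)  # torch.Size([1, 64, 1, 1])
```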
I see the issue.
For training, the product of the feature-map size and the batch size should be greater than 1. As the feature-map size is 1 after average pooling, the batch size has to be greater than 1.
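A minimal reproduction of this constraint: with a 1×1 feature map, BatchNorm in training mode needs more than one sample per channel to compute batch statistics, so batch size 1 fails while batch size 2 works.

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(64)
bn.train()

# batch 1 with a 1x1 feature map gives only one value per channel,
# so the batch statistics are undefined and BatchNorm raises ValueError
try:
    bn(torch.randn(1, 64, 1, 1))
except ValueError as e:
    print("batch=1 fails:", e)

# batch 2 gives two values per channel, so training works
out = bn(torch.randn(2, 64, 1, 1))
print("batch=2 works:", out.shape)
```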
For evaluation, it is okay. Please use model.eval() before doing inference.
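To illustrate why eval mode sidesteps the error: after model.eval(), BatchNorm uses its stored running statistics instead of batch statistics, so a single-sample [1, 64, 1, 1] input is accepted.

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(64)

# eval mode normalizes with the running mean/var accumulated during
# training, so no per-batch statistics are needed and batch size 1 is fine
bn.eval()
out = bn(torch.randn(1, 64, 1, 1))
print(out.shape)  # torch.Size([1, 64, 1, 1])
```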
When I try to integrate the attention module into YOLOv3, if norm_layer is not set to None, an error is reported: ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 64, 1, 1]). How can I solve this problem?