bo-miao opened this issue 3 years ago · Open
One solution is to pad the image so its size is divisible by 16. Would it be possible to make Lite-HRNet compatible with any input size?
I met this problem, too. How can I pad the image to make it divisible by a number?
Just calculate the padded size, and pad the images with pad = nn.ZeroPad2d(padding=(0, pad_w, 0, pad_h))
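For reference, here is a minimal sketch of that padding step (the helper name pad_to_divisible and the variable names are illustrative, not from the Lite-HRNet code):

import torch
import torch.nn as nn

def pad_to_divisible(img, divisor=16):
    """Zero-pad an NCHW tensor on the right/bottom so H and W are divisible by `divisor`."""
    h, w = img.shape[-2:]
    pad_h = (divisor - h % divisor) % divisor
    pad_w = (divisor - w % divisor) % divisor
    # (left, right, top, bottom): pad only the right and bottom edges
    pad = nn.ZeroPad2d(padding=(0, pad_w, 0, pad_h))
    return pad(img)

# e.g. a 480x854 frame becomes 480x864, which is divisible by 16
x = torch.randn(1, 3, 480, 854)
print(pad_to_divisible(x).shape)  # torch.Size([1, 3, 480, 864])

You can crop the network output back to the original resolution afterwards if needed.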
Good try! Thanks.
Hi,
I am trying to use Lite-HRNet on some videos with 480x854 resolution, but the following part raises a size mismatch error (around line 630). Below are the code and the log of that part; how could I solve this issue?
Thanks in advance.
======================================
if self.with_fuse:
    out_fuse = []
    for i in range(len(self.fuse_layers)):
        y = out[0] if i == 0 else self.fuse_layers[i][0](out[0])
        for j in range(self.num_branches):
            if i == j:
                y += out[j]
            else:
                print(i, j, y.shape, self.fuse_layers[i][j](out[j]).shape)
                y += self.fuse_layers[i][j](out[j])
        out_fuse.append(self.relu(y))
    out = out_fuse
======================================
0 1 torch.Size([1, 40, 120, 214]) torch.Size([1, 40, 120, 214])
1 0 torch.Size([1, 80, 60, 107]) torch.Size([1, 80, 60, 107])
0 1 torch.Size([1, 40, 120, 214]) torch.Size([1, 40, 120, 214])
1 0 torch.Size([1, 80, 60, 107]) torch.Size([1, 80, 60, 107])
0 1 torch.Size([1, 40, 120, 214]) torch.Size([1, 40, 120, 214])
1 0 torch.Size([1, 80, 60, 107]) torch.Size([1, 80, 60, 107])
0 1 torch.Size([1, 40, 120, 214]) torch.Size([1, 40, 120, 214])
0 2 torch.Size([1, 40, 120, 214]) torch.Size([1, 40, 120, 216])
========================================
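In case it helps: the 214 vs. 216 mismatch in the last log line is consistent with the width 854 not being divisible by 16. The stride-2 downsampling rounds the width to 427, 214, 107, 54 along the branches, while the fuse layer upsamples the lowest-resolution branch by a factor of 4, giving 54 x 4 = 216 instead of 214. A rough sanity check, assuming stride-2 3x3 convolutions with padding 1 (an assumption about the downsampling layers, not copied from the Lite-HRNet code):

# width at each downsampling stage for an input width of 854
w = 854
widths = [w]
for _ in range(4):
    w = (w + 2 * 1 - 3) // 2 + 1  # stride-2 conv, kernel 3, padding 1
    widths.append(w)
print(widths)         # [854, 427, 214, 107, 54]
print(widths[4] * 4)  # 216 -- does not match the 1/4-resolution width of 214

Padding the width from 854 to 864, as suggested above, makes every stage divide evenly (864, 432, 216, 108, 54), so the upsampled 54 x 4 = 216 matches the 1/4-resolution width again.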