@datvuthanh @xoiga123
Hi. I am receiving the error below when I pass a .jpg input image of size (2160, 4096, 3) for testing. Can you please help me resolve this issue? Thank you!
Command I ran: `python hybridnets_test.py -w weights/hybridnets.pth --source demo/image --output demo_result --imshow False --imwrite True`
Traceback (most recent call last):
File "hybridnets_test.py", line 121, in <module>
features, regression, classification, anchors, seg = model(x)
File "env/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "folder/HybridNets/backbone.py", line 104, in forward
features = self.bifpn(features)
File "env/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "env/lib/python3.8/site-packages/torch/nn/modules/container.py", line 141, in forward
input = module(input)
File "env/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "folder/HybridNets/hybridnets/model.py", line 179, in forward
outs = self._forward_fast_attention(inputs)
File "folder/HybridNets/hybridnets/model.py", line 211, in _forward_fast_attention
p5_up = self.conv5_up(self.swish(weight[0] * p5_in + weight[1] * self.p5_upsample(p6_up)))
RuntimeError: The size of tensor a (11) must match the size of tensor b (12) at non-singleton dimension 2
Hi @Gateway2745, this happens because both input dimensions need to be multiples of 32. 2160 / 32 = 67.5, which is not an integer, so the feature maps end up misaligned inside the BiFPN. You must resize your input first; my suggestion is 640×384.
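To make the constraint concrete, here is a small illustrative sketch (not part of HybridNets; the helper name is hypothetical) that checks divisibility and rounds a dimension to the nearest multiple of 32:

```python
def nearest_multiple_of_32(x: int) -> int:
    """Round x to the nearest multiple of 32 (never below 32)."""
    return max(32, ((x + 16) // 32) * 32)

# The 2160 dimension is the culprit: 2160 / 32 = 67.5, not an integer.
assert 2160 % 32 != 0
assert 4096 % 32 == 0

# Nearest valid sizes to the original image dimensions:
print(nearest_multiple_of_32(2160))  # 2176
print(nearest_multiple_of_32(4096))  # 4096

# The suggested 640x384 also satisfies the constraint:
assert 640 % 32 == 0 and 384 % 32 == 0
```

In practice you would resize the image before inference, e.g. `img = cv2.resize(img, (640, 384))`, rather than feed the raw (2160, 4096, 3) array to the model.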