sanghyun-son / EDSR-PyTorch

PyTorch version of the paper 'Enhanced Deep Residual Networks for Single Image Super-Resolution' (CVPRW 2017)
MIT License

When I train RCAN, something goes wrong: RuntimeError: Expected 4-dimensional input for 4-dimensional weight 3 3 1, but got 3-dimensional input of size [1, 184, 270] instead #271

Open countingstarsmer opened 4 years ago

countingstarsmer commented 4 years ago

    python main.py --template RCAN --save RCAN_BIX2_G10R20P48 --scale 2 --reset --save_results --patch_size 96

and then

    Traceback (most recent call last):
      File "main.py", line 33, in <module>
        main()
      File "main.py", line 28, in main
        t.test()
      File "/home/zhj/EDSR-1.1.0/src/trainer.py", line 89, in test
        sr = self.model(lr, idx_scale)
      File "/home/zhj/anaconda3/envs/pytorch1.1/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
        result = self.forward(*input, **kwargs)
      File "/home/zhj/EDSR-1.1.0/src/model/__init__.py", line 57, in forward
        return forward_function(x)
      File "/home/zhj/EDSR-1.1.0/src/model/__init__.py", line 135, in forward_chop
        y = self.forward_chop(p, shave=shave, min_size=min_size)
      File "/home/zhj/EDSR-1.1.0/src/model/__init__.py", line 126, in forward_chop
        y = P.data_parallel(self.model, x, range(n_GPUs))
      File "/home/zhj/anaconda3/envs/pytorch1.1/lib/python3.6/site-packages/torch/nn/parallel/data_parallel.py", line 204, in data_parallel
        return module(*inputs[0], **module_kwargs[0])
      File "/home/zhj/anaconda3/envs/pytorch1.1/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
        result = self.forward(*input, **kwargs)
      File "/home/zhj/EDSR-1.1.0/src/model/rcan.py", line 107, in forward
        x = self.sub_mean(x)
      File "/home/zhj/anaconda3/envs/pytorch1.1/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
        result = self.forward(*input, **kwargs)
      File "/home/zhj/anaconda3/envs/pytorch1.1/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 338, in forward
        self.padding, self.dilation, self.groups)
    RuntimeError: Expected 4-dimensional input for 4-dimensional weight 3 3 1, but got 3-dimensional input of size [1, 184, 270] instead
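The failing check can be reproduced without PyTorch. This is a stdlib-only sketch: shapes are plain tuples, and `conv2d_check` is an illustrative stand-in for the dimension validation `torch.nn.Conv2d` performs internally (the name and signature are assumptions, not PyTorch API).

```python
# Stdlib-only sketch of the shape check that fails inside Conv2d.
# conv2d_check is a hypothetical stand-in for torch's internal
# dimension validation; it is NOT part of the PyTorch API.

def conv2d_check(input_shape, weight_ndim=4):
    """Raise if the input does not have as many dims as the conv weight."""
    if len(input_shape) != weight_ndim:
        raise RuntimeError(
            f"Expected {weight_ndim}-dimensional input, "
            f"but got {len(input_shape)}-dimensional input of size "
            f"{list(input_shape)} instead"
        )
    return input_shape

# The failing call: a single chopped patch of size [1, 184, 270],
# i.e. channels/height/width only, with no batch dimension.
try:
    conv2d_check((1, 184, 270))
except RuntimeError as e:
    print(e)

# Re-adding a leading batch dimension (what unsqueeze(0) does)
# makes the input 4-D and the check passes.
conv2d_check((1, 1, 184, 270))
```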

blueardour commented 4 years ago

Same error on my machine

Raymondmax commented 3 years ago

+1

flymmmfly commented 3 years ago

How can this be solved? Please help!

RuyuXu2019 commented 3 years ago

Same error on my machine

QiangLi1997 commented 3 years ago

You could refer to #184 .

Hunter-Murphy commented 3 years ago

In my opinion, the error mentioned above is caused by a mismatch in the number of dimensions: the convolution requires a 4-dimensional input, but a 3-dimensional one was given. You should step through the lines listed in the traceback one by one and monitor how the variables' dimensions change during the run.

flauted commented 3 years ago

I was successful with the fix from #184 referenced above. In particular, model/__init__.py line 133 onwards became:

        else:
            for p in zip(*x_chops):
                p = [p_.unsqueeze(0) for p_ in p]
                y = self.forward_chop(*p, shave=shave, min_size=min_size)

I really don't have a good explanation, but it seems to work.


I guess the gist of it is that x_chops contains a tensor for each input to forward_chop (args is List[Tensor(B x C x H x W)]). Each tensor is cut into four quarters that are concatenated along the batch dimension, so you end up with something along the lines of x_chops being List[Tensor(B*4 x C x H/2 x W/2)]. Then the "clever" line

for p in zip(*x_chops):

is equivalent to something like

for i in range(B*4):
    p = [x_ch[i, ...] for x_ch in x_chops]

which, as you can see when it's not so "clever", drops the first (batch) dimension of each element in x_chops. That is a problem because p is the recursive input to forward_chop. :(

wkiulu commented 3 years ago

I solved this problem by omitting "--chop" from the options (python==3.6, pytorch==1.1).