Open ch135 opened 5 years ago
Hello.
It seems that there are some problems with the --chop argument (#140).
I will try to fix it.
Thank you!
Excuse me! Have you fixed it?
@thstkdgus35 Hi, I am waiting for the same fix; I hope you may find some time for it soon. I was preparing to try it in another project, but the failure of --chop stopped me badly. Thanks for the great code anyway; some tests surprisingly come out with "better than original" results, actually.
@ontheway16 could you please elaborate which tests and how? I'm curious and it might also help anyone in the future working on this too. Thanks.
Excuse me, have you fixed that problem?
@fengshenfeilian Hello. Would you specify your input? I cannot reproduce this error with my images.
Hello, I am very grateful for your improved code. However, I have encountered the following problem when I run:
python main.py --model EDSR --scale 2 --patch_size 96 --save edsr_baseline_x2 --reset --n_GPUs 2 --chop
Traceback (most recent call last):
  File "main.py", line 33, in <module>
    main()
  File "main.py", line 28, in main
    t.test()
  File "/media/wangct/E7D0AC3987855C5C/ch_git/EDSR-PyTorch/src/trainer.py", line 89, in test
    sr = self.model(lr, idx_scale)
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/media/wangct/E7D0AC3987855C5C/ch_git/EDSR-PyTorch/src/model/__init__.py", line 57, in forward
    return forward_function(x)
  File "/media/wangct/E7D0AC3987855C5C/ch_git/EDSR-PyTorch/src/model/__init__.py", line 135, in forward_chop
    y = self.forward_chop(p, shave=shave, min_size=min_size)
  File "/media/wangct/E7D0AC3987855C5C/ch_git/EDSR-PyTorch/src/model/__init__.py", line 126, in forward_chop
    y = P.data_parallel(self.model, x, range(n_GPUs))
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/parallel/data_parallel.py", line 188, in data_parallel
    outputs = parallel_apply(replicas, inputs, module_kwargs, used_device_ids)
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/parallel/parallel_apply.py", line 83, in parallel_apply
    raise output
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/parallel/parallel_apply.py", line 59, in _worker
    output = module(*input, **kwargs)
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/media/wangct/E7D0AC3987855C5C/ch_git/EDSR-PyTorch/src/model/edsr.py", line 52, in forward
    x = self.sub_mean(x)
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/wangct/.conda/envs/chenhao/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 320, in forward
    self.padding, self.dilation, self.groups)
RuntimeError: Expected 4-dimensional input for 4-dimensional weight [3, 3, 1, 1], but got 3-dimensional input of size [1, 184, 270] instead
Could you help me solve this problem? Looking forward to your reply!
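For future readers: this RuntimeError is raised when a 3-D tensor reaches an nn.Conv2d layer that expects 4-D (N, C, H, W) input, i.e. somewhere along the chop path the batch dimension has been lost. The sketch below is hypothetical and not the repository's code; the conv shape mirrors the [3, 3, 1, 1] weight in the error message, and the "fix" is simply guaranteeing a batch dimension before the call.

```python
import torch
import torch.nn as nn

# A 1x1 conv like EDSR's sub_mean layer: weight shape [3, 3, 1, 1]
conv = nn.Conv2d(3, 3, kernel_size=1)

# A single image tensor that is missing its batch dimension (C, H, W)
x = torch.randn(3, 184, 270)

# Defensive check: ensure the tensor is 4-D (N, C, H, W) before the conv
if x.dim() == 3:
    x = x.unsqueeze(0)  # -> shape [1, 3, 184, 270]

y = conv(x)
print(y.shape)  # torch.Size([1, 3, 184, 270])
```

In the actual repository the shape loss would need to be fixed where forward_chop splits and reassembles patches, but the check above is a quick way to confirm the diagnosis.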