Closed BeAShaper closed 3 years ago
I am unable to reproduce this error; my apologies.
(base) sniklaus@sniklaus:~$ python run.py
Downloading: "http://content.sniklaus.com/github/pytorch-pwc/network-default.pytorch" to /home/sniklaus/.cache/torch/hub/checkpoints/pwc-default
100%|██████████████████████████████████████████████████████████| 35.8M/35.8M [00:09<00:00, 4.13MB/s]
(base) sniklaus@sniklaus:~$ stat out.flo
File: out.flo
Size: 3571724 Blocks: 6984 IO Block: 4096 regular file
Device: fd01h/64769d Inode: 23861123 Links: 1
Access: (0664/-rw-rw-r--) Uid: ( 1000/sniklaus) Gid: ( 1000/sniklaus)
Access: 2021-03-11 08:44:21.338299315 -0800
Modify: 2021-03-11 08:44:21.334299352 -0800
Change: 2021-03-11 08:44:21.334299352 -0800
Birth: -
(base) sniklaus@sniklaus:~$
You also seem to have made some changes; your file is called run_pwc.py instead of run.py, and I am not sure what you changed to cause this error.
hey, thank you for sharing! When I ran the code following the commands, an error occurred.
File "run_pwc.py", line 207, in forward
    tenFeat = torch.cat([ self.netOne(tenFeat), tenFeat ], 1)
File "D:\Anaconda3\envs\torch\lib\site-packages\torch\nn\modules\module.py", line 722, in _call_impl
    result = self.forward(*input, **kwargs)
File "D:\Anaconda3\envs\torch\lib\site-packages\torch\nn\modules\container.py", line 117, in forward
    input = module(input)
File "D:\Anaconda3\envs\torch\lib\site-packages\torch\nn\modules\module.py", line 722, in _call_impl
    result = self.forward(*input, **kwargs)
File "D:\Anaconda3\envs\torch\lib\site-packages\torch\nn\modules\conv.py", line 419, in forward
    return self._conv_forward(input, self.weight)
File "D:\Anaconda3\envs\torch\lib\site-packages\torch\nn\modules\conv.py", line 416, in _conv_forward
    self.padding, self.dilation, self.groups)
RuntimeError: Given groups=1, weight of size [128, 81, 3, 3], expected input[1, 49, 4, 8] to have 81 channels, but got 49 channels instead
It seems like the dimension is wrong somewhere. Could you help me figure it out? I would really appreciate it.
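For what it's worth, the mismatch itself is easy to reproduce in isolation: 49 = 7×7 and 81 = 9×9, so the conv layer appears to expect a 9×9 correlation volume (maximum displacement 4) but is being fed a 7×7 one (maximum displacement 3), which would point at a change to the correlation step in the modified run_pwc.py. A minimal sketch (the layer and tensor sizes are taken from the error message; the plain Conv2d here merely stands in for the first convolution inside netOne):

```python
import torch

# First conv of the block expects 81 input channels, as in the error:
# weight of size [128, 81, 3, 3]
conv = torch.nn.Conv2d(in_channels=81, out_channels=128, kernel_size=3, padding=1)

# Input shape copied from the error message: [1, 49, 4, 8]
bad_input = torch.randn(1, 49, 4, 8)
try:
    conv(bad_input)
except RuntimeError as err:
    # Same channel-mismatch error as in the traceback above
    print(err)

# With the expected 81 channels (9x9 correlation volume) it runs fine
good_input = torch.randn(1, 81, 4, 8)
print(conv(good_input).shape)  # torch.Size([1, 128, 4, 8])
```

If the correlation call (or its max-displacement setting) was edited in run_pwc.py, restoring it so the cost volume has 81 channels should resolve the error.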