zhanghang1989 / PyTorch-Encoding

A CV toolkit for my papers.
https://hangzhang.org/PyTorch-Encoding/
MIT License

Error about output = module(*(input + target), **kwargs) #151

Open suyanzhou626 opened 5 years ago

suyanzhou626 commented 5 years ago

When I use `DataParallelModel` and `DataParallelCriterion`, an error occurs:

```
Traceback (most recent call last):
  File "train_SyncBN.py", line 151, in <module>
    loss = criterion(outputs, labels)
  File "/home/phd-1/.conda/envs/pytorch-0.41/lib/python3.6/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/phd-1/.conda/envs/pytorch-0.41/lib/python3.6/site-packages/encoding/parallel.py", line 134, in forward
    outputs = _criterion_parallel_apply(replicas, inputs, targets, kwargs)
  File "/home/phd-1/.conda/envs/pytorch-0.41/lib/python3.6/site-packages/encoding/parallel.py", line 188, in _criterion_parallel_apply
    raise output
  File "/home/phd-1/.conda/envs/pytorch-0.41/lib/python3.6/site-packages/encoding/parallel.py", line 163, in _worker
    output = module(*(input + target), **kwargs)
TypeError: add() received an invalid combination of arguments - got (tuple), but expected one of:
```
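The traceback hints at the cause: in `encoding/parallel.py`, `_worker` calls `module(*(input + target), **kwargs)`, which only works when both `input` and `target` are tuples, so that `+` concatenates them into one argument list. A minimal plain-Python sketch of this mechanism (no PyTorch needed; `toy_criterion` is a hypothetical stand-in for a loss module):

```python
# Sketch of the call in encoding/parallel.py's _worker:
#   output = module(*(input + target), **kwargs)
# Here `+` is tuple concatenation, so both operands must be tuples.

def toy_criterion(output, label):
    # hypothetical stand-in for a criterion's forward
    return output - label

inputs = (10,)   # model output wrapped in a tuple, i.e. forward did `return (x,)`
targets = (3,)   # targets are already scattered as tuples
print(toy_criterion(*(inputs + targets)))  # -> 7

# If forward returns a bare tensor instead, `input + target` becomes
# tensor.add(tuple), which raises the TypeError shown above.
```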

zhanghang1989 commented 5 years ago

Please return `(x,)` (a tuple) from your model's `forward`.

leezonpen commented 5 years ago

@suyanzhou626 did you solve this problem?

artzers commented 5 years ago

@leezonpen @suyanzhou626 In your model code, write it as:

```python
class Model(...):           # your base class
    def forward(self, ...):
        # ... your net code ...
        x = layer(x)
        return (x,)         # instead of "return x"
```
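One side effect of this fix, sketched below in plain Python (hypothetical `TinyModel`, no PyTorch needed): every caller now receives a one-element tuple, so any non-parallel code path has to unpack it.

```python
class TinyModel:
    # hypothetical model following the suggested tuple-return pattern
    def forward(self, x):
        x = x * 2        # stand-in for the network layers
        return (x,)      # tuple return instead of a bare value

model = TinyModel()
output = model.forward(3)[0]  # unpack the one-element tuple
print(output)                 # -> 6
```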

leezonpen commented 5 years ago

@artzers It doesn't work for me. I gather the outputs and slice them for a multi-task loss (scattering before input to the criterion). It seems like some problem appears inside `DataParallelCriterion`. How can I solve it?
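For the multi-task case, a guess sketched in plain Python (hypothetical names throughout): since the parallel worker calls `criterion(*(outputs + targets))`, a forward that returns several heads as a tuple needs a criterion whose forward accepts all of them positionally, rather than slicing a gathered output afterwards.

```python
def multi_task_criterion(out_a, out_b, label):
    # hypothetical criterion taking both task heads positionally
    return (out_a - label) + 2 * (out_b - label)

outputs = (5, 4)   # forward returned two heads: return (out_a, out_b)
targets = (1,)     # label, scattered as a tuple
print(multi_task_criterion(*(outputs + targets)))  # -> 10
```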

artzers commented 5 years ago

I can run it for simple tasks such as super-resolution or segmentation. I haven't tested a multi-task loss. I'll try later, or you can continue to contact @zhanghang1989.