jalola / improved-wgan-pytorch

Improved WGAN in PyTorch
MIT License

RuntimeError: Mismatch in shape: grad_output[0] has a shape of torch.Size([1]) and output[0] has a shape of torch.Size([]) #11

Closed · mcginley182 closed this issue 4 years ago

mcginley182 commented 5 years ago

I am trying to use my own dataset (64x64). When I run the code I get the following error:

```
RuntimeError                              Traceback (most recent call last)
in ()
    259 lib.plot.tick()
    260
--> 261 train()

3 frames
in train()
    155 gen_cost = aD(fake_data)
    156 gen_cost = gen_cost.mean()
--> 157 gen_cost.backward(mone)
    158 gen_cost = -gen_cost
    159

/usr/local/lib/python3.6/dist-packages/torch/tensor.py in backward(self, gradient, retain_graph, create_graph)
    148                 products. Defaults to ``False``.
    149         """
--> 150         torch.autograd.backward(self, gradient, retain_graph, create_graph)
    151
    152     def register_hook(self, hook):

/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
     91         grad_tensors = list(grad_tensors)
     92
---> 93     grad_tensors = _make_grads(tensors, grad_tensors)
     94     if retain_graph is None:
     95         retain_graph = create_graph

/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py in _make_grads(outputs, grads)
     27                 + str(grad.shape) + " and output["
     28                 + str(outputs.index(out)) + "] has a shape of "
---> 29                 + str(out.shape) + ".")
     30             new_grads.append(grad)
     31         elif grad is None:

RuntimeError: Mismatch in shape: grad_output[0] has a shape of torch.Size([1]) and output[0] has a shape of torch.Size([]).
```

Any advice on how to fix this?
jalola commented 5 years ago

Hi,

Can you check why `output[0]` has a shape of `torch.Size([])`?

Maybe you can add a debug statement, logging, or a `print` right after line 156 (`gen_cost = gen_cost.mean()`) to see why it ends up with shape `torch.Size([])`.
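
For reference, a minimal standalone check along those lines; the random tensor below is only a stand-in for `aD(fake_data)`, and the `mone = torch.FloatTensor([-1])` line is an assumption about how `mone` is built in the training script:

```python
import torch

# Stand-in for the discriminator output aD(fake_data): shape (batch, 1)
disc_out = torch.randn(64, 1, requires_grad=True)

gen_cost = disc_out.mean()
print(gen_cost.shape)           # torch.Size([]) -- .mean() returns a 0-dim tensor

mone = torch.FloatTensor([-1])  # assumed construction of mone; shape torch.Size([1])
print(mone.shape)               # torch.Size([1]) does not match torch.Size([]),
                                # so gen_cost.backward(mone) raises the error above
```

In other words, the output is a scalar (0-dim) tensor rather than a tensor of size 0, so the gradient passed to `backward` has to be 0-dim as well.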

mcginley182 commented 5 years ago

To resolve this issue, `one = torch.FloatTensor([1])` needs to be changed to `one = torch.tensor(1, dtype=torch.float)`.
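
For context, a small sketch of why that change works; the `mone = one * -1` line is my assumption about how the negated tensor is derived, and the random tensor stands in for `aD(fake_data)`:

```python
import torch

# 0-dim (scalar) tensors, matching the shape that .mean() produces
one = torch.tensor(1, dtype=torch.float)
mone = one * -1  # assumption: mone is derived from one this way in the script

# Stand-in for gen_cost = aD(fake_data).mean(): a 0-dim tensor in the graph
gen_cost = torch.randn(64, 1, requires_grad=True).mean()

# Both sides are now torch.Size([]), so backward accepts the gradient
gen_cost.backward(mone)
print(gen_cost.shape, mone.shape)  # torch.Size([]) torch.Size([])
```

Depending on how the script builds `mone`, the same change presumably has to reach it as well (or it can simply be derived from `one` as above).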

jalola commented 5 years ago

Great! Which torch version are you using?