martinarjovsky / WassersteinGAN

BSD 3-Clause "New" or "Revised" License
3.2k stars · 725 forks

Why have a tensor of 1 or -1 in loss.backward()? #60

Closed arpan-dhatt closed 5 years ago

arpan-dhatt commented 6 years ago

What happens if we get rid of the `mone` and `one` tensors in the `.backward()` calls for the losses? What are they for?

FlyingCarrot commented 6 years ago

@NeuronCrossroads The argument passed into `backward()` controls the scale of the final gradients, so `output.backward()` and `output.backward(1)` perform the same operation (the default gradient is 1). `output.backward(-1)`, however, flips the gradients to the opposite direction.
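A minimal sketch of this behavior (assuming PyTorch; the tensor names `one` and `mone` mirror the ones used in the WGAN training script, and `w`/`out` are made up for illustration):

```python
import torch

# `one` and `mone` are 1-element tensors holding +1 and -1, as in the
# WGAN training loop. Passing them to backward() scales the gradient:
# backward() with no argument is equivalent to backward(one), while
# backward(mone) negates every gradient, flipping descent into ascent.
one = torch.ones(1)
mone = -one

w = torch.tensor([2.0], requires_grad=True)
out = 3.0 * w                    # d(out)/dw = 3

out.backward(one, retain_graph=True)
grad_plus = w.grad.clone()       # gradient scaled by +1 -> [3.0]

w.grad.zero_()                   # clear accumulated gradient
out.backward(mone)
grad_minus = w.grad.clone()      # gradient scaled by -1 -> [-3.0]

print(grad_plus.item(), grad_minus.item())  # 3.0 -3.0
```

So in the WGAN code, `one` and `mone` are not optional extras: they decide whether a given `backward()` call pushes the parameters to increase or decrease the critic's output.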