Closed · atheeth96 closed this issue 4 years ago
Hi,
When we run backward for G, the code passes minus one (-1) into `backward()` to flip the sign of the gradients. This `backward(mone)` trick is a confusing holdover from older implementations.
It should be like this:

```python
gen_cost = -aD(fake_data)
gen_cost.backward()
```
https://github.com/jalola/improved-wgan-pytorch/blob/master/train.py#L132
Otherwise, during the training of the generator, the gradients are calculated w.r.t. D(G(z)) instead of -D(G(z)).
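For what it's worth, the two styles should produce identical gradients, since passing a gradient of -1 into `backward()` scales the backward pass by -1. A minimal sketch (assuming PyTorch; the `x * 3.0` loss is just a stand-in for `aD(fake_data).mean()`):

```python
import torch

# Old style: backward(mone), where mone is a -1 tensor
x = torch.tensor(2.0, requires_grad=True)
loss = x * 3.0                       # stand-in for aD(fake_data).mean()
loss.backward(torch.tensor(-1.0))    # scales the gradient by -1
grad_old = x.grad.item()

# Suggested style: negate the loss, then call backward() normally
y = torch.tensor(2.0, requires_grad=True)
neg_loss = -(y * 3.0)                # gen_cost = -aD(fake_data)
neg_loss.backward()
grad_new = y.grad.item()

print(grad_old, grad_new)            # both gradients are -3.0
```

So the suggested form is clearer but not a behavioral fix; both maximize D(G(z)) for the generator.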