akanimax / pro_gan_pytorch

Unofficial PyTorch implementation of the paper titled "Progressive Growing of GANs for Improved Quality, Stability, and Variation"
MIT License

Possible typo in LSGAN loss #13

Closed ngoyal2707 closed 5 years ago

ngoyal2707 commented 5 years ago

The current code of the LSGAN loss does the following:

0.5 * ((th.mean(self.dis(fake_samps, height, alpha)) - 1) ** 2)

which is the square of the mean error; it should probably be the mean of the squared error, as follows:

0.5 * (th.mean((self.dis(fake_samps, height, alpha) - 1) ** 2))

If the above is correct, the same applies to both the generator and discriminator losses of LSGAN and LSGANSigmoid.

Note: I haven't actually run the above code; I just noticed this while looking at the other losses in your code.
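The distinction matters because the square of the mean can vanish even when every individual sample has a nonzero error. A minimal sketch in plain Python (hypothetical scores standing in for discriminator outputs, no torch dependency) illustrates the difference:

```python
# Hypothetical discriminator outputs for two fake samples.
scores = [2.0, 0.0]

# Square of the mean (the buggy form): errors of +1 and -1 cancel
# inside the mean, so the loss is zero despite both samples being wrong.
mean = sum(scores) / len(scores)               # 1.0
square_of_mean = 0.5 * (mean - 1) ** 2         # 0.0

# Mean of the squares (the corrected form): each sample's error
# contributes before averaging, so the loss stays positive.
mean_of_squares = 0.5 * sum((s - 1) ** 2 for s in scores) / len(scores)  # 0.5

print(square_of_mean, mean_of_squares)  # 0.0 0.5
```

With the buggy form, the generator can minimize the loss by balancing scores around 1 rather than pushing every sample's score toward 1, which defeats the point of the least-squares objective.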

akanimax commented 5 years ago

@ngoyal2707,

Thank you for pointing this out. I had never actually used LSGAN with ProGAN, as the paper mentioned some additional stabilization tricks involving noise addition. I will make the required correction soon.

Best regards, @akanimax

ngoyal2707 commented 5 years ago

@akanimax Sure. I was using LSGAN in ProGAN with the noise trick from the paper, which gives decent results, but I sometimes still get random mode collapses where, after some epochs, the generator suddenly starts producing random noise.

akanimax commented 5 years ago

@ngoyal2707, interesting observation. I'll try it sometime too. BTW, the error is fixed in 0368a2665b52b843785ffe0b18b3154fd32539b0.

Closing the issue now. Thanks once again.

Cheers :beers:! @akanimax