akanimax / pro_gan_pytorch

Unofficial PyTorch implementation of the paper "Progressive Growing of GANs for Improved Quality, Stability, and Variation"
MIT License

Q: Equalized convolutions: do they make a big difference? #17

Closed Mut1nyJD closed 5 years ago

Mut1nyJD commented 5 years ago

Thank you for the nice implementation of ProGAN; it is far more readable than the TensorFlow version.

I was just wondering whether, in your experience, using the special equalized convolution made any big difference in the visual results?

akanimax commented 5 years ago

@Mut1nyJD,

First of all, thanks a lot for the compliment. It means a lot to me.

To answer your question: yes, it makes a huge difference. I can say this because, if you look at some of the earlier commits, there was a bug in the equalized learning rate and the results were not coming anywhere close to the visual results of the ProGAN paper. Besides, I really think that the equalized learning rate improves the stability of training.
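For concreteness, here is a minimal sketch of what an equalized convolution looks like (the class name and constructor signature are illustrative, not this repo's exact API):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EqualizedConv2d(nn.Module):
    """Conv layer with the runtime weight scaling from the ProGAN paper."""

    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0):
        super().__init__()
        # Weights are drawn from N(0, 1) instead of a careful initializer...
        self.weight = nn.Parameter(
            torch.randn(out_ch, in_ch, kernel_size, kernel_size))
        self.bias = nn.Parameter(torch.zeros(out_ch))
        # ...and scaled by the He constant sqrt(2 / fan_in) at runtime.
        fan_in = in_ch * kernel_size * kernel_size
        self.scale = (2.0 / fan_in) ** 0.5
        self.stride, self.padding = stride, padding

    def forward(self, x):
        return F.conv2d(x, self.weight * self.scale, self.bias,
                        stride=self.stride, padding=self.padding)
```

The key point is that the scaling happens at every forward pass rather than once at initialization, so Adam's per-parameter updates see weights of comparable magnitude in every layer.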

Hope it helps.

Best regards, @akanimax

Mut1nyJD commented 5 years ago

@akanimax No problem, my pleasure. It is great to see people go to the extra effort to produce something that is actually readable. :)

Thanks for your answer. Interesting: what do you mean by visual results, the quality of individual images or the diversity over the latent space? It looks to me like the equalized learning rate for the convolution layers might act similarly to the spectral normalization used, for example, in SAGAN or BigGAN. But I thought maybe the PixelNorm already takes care of most of that.

I am still trying to figure out which of the two, BigGAN or ProGAN, is worth spending more time on. Both seem to produce very impressive results.

In your experience, which of the loss formulations works best: WGAN, WGAN-GP, LSGAN, SGAN? Or does it not matter at all?

Also, have you tried using asymmetric learning rates for the discriminator and generator, as suggested, for example, in SAGAN?

akanimax commented 5 years ago

Hi @Mut1nyJD,

1.) By visual quality, I mean the quality of the individual images. I haven't done the ablation study myself, but from my understanding it is the MinibatchStdDev layer in the discriminator that takes care of sample diversity (see the sketch after this list).

2.) PixelNorm restricts each spatial feature vector to unit length, which keeps the generator's signal magnitudes from escalating, while the equalized learning rate speeds up training in a way similar to careful weight initialization such as He or Xavier initialization (a PixelNorm sketch is below, too).

3.) BigGAN and ProGAN are both amazing papers, but BigGAN's scores are way higher than the others'. Perhaps studying BigGAN should be preferred; that is my personal opinion, though. Also, check out StyleGAN, which superseded ProGAN.

4.) Well, all the losses are more or less the same (see "Are GANs Created Equal?"). But do check out the relativistic versions of these losses. From experience, Relativistic Average HingeGAN is very stable (magically stable), converges on any dataset, and mostly doesn't require TTUR (a sketch of this loss is below as well).
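For reference, here are minimal sketches of the mechanisms mentioned above. Class and function names are illustrative rather than this repo's exact API, and they assume `x` is a standard `(N, C, H, W)` activation tensor.

The MinibatchStdDev layer appends a batch-diversity statistic as an extra feature map (simplified here to a single statistic; the paper's version splits the batch into groups):

```python
import torch
import torch.nn as nn


class MinibatchStdDev(nn.Module):
    """Appends the minibatch standard deviation as an extra feature map,
    giving the discriminator a direct signal about sample diversity."""

    def forward(self, x, eps=1e-8):
        # Std of every feature over the batch, averaged to one scalar.
        std = torch.sqrt(x.var(dim=0, unbiased=False) + eps).mean()
        # Replicate the scalar as one extra channel and concatenate.
        extra = std.expand(x.size(0), 1, x.size(2), x.size(3))
        return torch.cat([x, extra], dim=1)
```

PixelNorm normalizes each pixel's feature vector across channels, per the formula in the ProGAN paper (same imports as above):

```python
class PixelNorm(nn.Module):
    """b(x, y) = a(x, y) / sqrt(mean_j a_j(x, y)^2 + eps)."""

    def forward(self, x, eps=1e-8):
        # Normalize across the channel dimension at every spatial location.
        return x * torch.rsqrt(x.pow(2).mean(dim=1, keepdim=True) + eps)
```

And the relativistic average hinge loss, where `real_scores` and `fake_scores` are raw (unactivated) discriminator outputs:

```python
def rahinge_dis_loss(real_scores, fake_scores):
    """Discriminator loss for Relativistic Average HingeGAN."""
    rel_real = real_scores - fake_scores.mean()
    rel_fake = fake_scores - real_scores.mean()
    return (torch.relu(1.0 - rel_real).mean()
            + torch.relu(1.0 + rel_fake).mean())


def rahinge_gen_loss(real_scores, fake_scores):
    """Generator loss: the real/fake roles are swapped."""
    rel_real = real_scores - fake_scores.mean()
    rel_fake = fake_scores - real_scores.mean()
    return (torch.relu(1.0 + rel_real).mean()
            + torch.relu(1.0 - rel_fake).mean())
```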

Hope this helps.

Best regards, @akanimax

OranjeeGeneral commented 5 years ago

Thanks for your additional comments; indeed, these are quite helpful.

On 3.) Yes, I know BigGAN seems to have impressive results in terms of some of the metric scores, but it is hard to say how it actually compares to ProGAN, as the two have not been run on the same datasets. Also, BigGAN looks a bit more complex than ProGAN, with the attention mechanism, spectral norm, and the training scheme. I kind of like the simplicity of ProGAN, and as you mentioned, it is a stepping stone to StyleGAN: if you want to add that on top, you need the basis of ProGAN.

On 4.) That's very interesting, thank you; I will have a look at the RA hinge loss. I tried WGAN, WGAN-GP, and LSGAN in the past on different generator/discriminator architectures, and they all start to become unstable after 40-50 epochs on my test datasets. My best experience so far is with GAN-QP, which seems stable way beyond the others I tried.

Thanks again for your answers; they are very helpful.

akanimax commented 5 years ago

@OranjeeGeneral,

You are most welcome.

Best regards, @akanimax

akanimax commented 5 years ago

@OranjeeGeneral, I am closing this issue for now, but please feel free to comment if you face any further issues with this.

Best regards, @akanimax