geek-ai / Texygen

A text generation benchmarking platform

On LeakGAN (non-interleaved) Pretraining #21

Open · josauder opened this issue 6 years ago

josauder commented 6 years ago

In your code, the LeakGAN generator is pretrained first, and only afterwards is the discriminator pretrained. If I understand correctly, this means that during generator pretraining the features leaked by the discriminator are useless (i.e. noise), and that these features only become useful rather abruptly once the discriminator has been pretrained (i.e. at the start of adversarial training).
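
Schematically, the current schedule looks something like this (just a rough sketch with placeholder callables and epoch counts, not the actual Texygen code):

```python
def pretrain_sequential(pretrain_g_epoch, pretrain_d_epoch,
                        g_epochs=80, d_epochs=50):
    """Non-interleaved order: generator first, discriminator afterwards.

    While the generator is being pretrained, the discriminator is still
    untrained, so the features it leaks to the generator are noise.
    (Placeholder signature and epoch counts, not the Texygen API.)
    """
    # 1) Pretrain the generator while the discriminator is untrained.
    for _ in range(g_epochs):
        pretrain_g_epoch()
    # 2) Only then pretrain the discriminator, so the leaked features
    #    first become meaningful at the start of adversarial training.
    for _ in range(d_epochs):
        pretrain_d_epoch()
```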

In the original LeakGAN code, as well as in the original paper (appendix: algorithm), the authors propose to pretrain the generator and the discriminator in an interleaved fashion.
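
Something along these lines is what I mean by interleaved pretraining (again only a sketch; the function names, outer-loop count, and step counts are placeholders, not the paper's exact values):

```python
def pretrain_interleaved(pretrain_g_epoch, pretrain_d_epoch,
                         outer_iters=16, g_steps=5, d_steps=3):
    """Interleaved pretraining in the spirit of the LeakGAN appendix
    algorithm: alternate generator and discriminator pretraining so the
    discriminator's leaked features are already informative while the
    generator is still being pretrained.
    (Placeholder signature and step counts, not the paper's exact setup.)
    """
    for _ in range(outer_iters):
        for _ in range(g_steps):
            pretrain_g_epoch()
        for _ in range(d_steps):
            pretrain_d_epoch()
```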