igul222 / improved_wgan_training

Code for reproducing experiments in "Improved Training of Wasserstein GANs"
MIT License

Please see the result of gan_64x64.py #10

Closed: kingofoz closed this issue 7 years ago

kingofoz commented 7 years ago

Hi @igul222 Please see the generated samples at iteration 199999. Is the result good? I am not sure what is being generated. :-) At the final iteration 199999, the train disc cost is -1.49 and the dev disc cost is -1.6. Is this good? I am not sure how to choose the best model across all iterations.

Thanks, Yingjun

[attached image: samples_199999]

igul222 commented 7 years ago

Those look like reasonable samples to me. Because 64x64 ImageNet is very diverse, nobody has managed to generate anything more than "vaguely realistic colorful blobs" yet. Re. the cost numbers, those sound reasonable, but using them for architecture search isn't a good idea yet because they depend on the critic architecture. For ImageNet, I'd just take the last iteration, since overfitting is unlikely to be a major concern.
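For context on what the "disc cost" numbers refer to: below is a minimal sketch of the WGAN-GP critic objective from the paper (the Wasserstein estimate plus a gradient penalty on interpolated samples), written in the TensorFlow 1.x graph style this repo uses. The `Discriminator` stand-in, the placeholders, and the constants are illustrative assumptions, not the exact code in gan_64x64.py.

```python
import tensorflow as tf  # assumes TensorFlow 1.x graph mode

BATCH_SIZE = 64    # illustrative values, not necessarily the script's defaults
LAMBDA = 10        # gradient-penalty weight from the WGAN-GP paper
DIM = 64 * 64 * 3  # flattened image dimensionality (placeholder)

def Discriminator(x):
    """Stand-in critic: a single linear layer, just to keep the sketch runnable."""
    with tf.variable_scope('critic', reuse=tf.AUTO_REUSE):
        return tf.layers.dense(x, 1)

real_data = tf.placeholder(tf.float32, [BATCH_SIZE, DIM])
fake_data = tf.placeholder(tf.float32, [BATCH_SIZE, DIM])

# Wasserstein term: the critic tries to score real data higher than fake data,
# so this part is typically negative once training gets going.
disc_cost = (tf.reduce_mean(Discriminator(fake_data))
             - tf.reduce_mean(Discriminator(real_data)))

# Gradient penalty on random interpolates between real and fake samples,
# pushing the critic's gradient norm toward 1.
alpha = tf.random_uniform([BATCH_SIZE, 1], minval=0., maxval=1.)
interpolates = real_data + alpha * (fake_data - real_data)
gradients = tf.gradients(Discriminator(interpolates), [interpolates])[0]
slopes = tf.sqrt(tf.reduce_sum(tf.square(gradients), axis=1))
gradient_penalty = tf.reduce_mean((slopes - 1.) ** 2)

disc_cost += LAMBDA * gradient_penalty
```

Under a definition like this, a moderately negative value such as the -1.5 reported above is expected: the Wasserstein term is negative once the critic separates real from fake, and the penalty term stays small while gradient norms remain near 1.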

kingofoz commented 7 years ago

Thanks, I got it! @igul222 For the other datasets, for example Toy, MNIST, gan_language.py, and CIFAR-10, how do you choose a good architecture?

Thanks, Yingjun