akanimax / pro_gan_pytorch

Unofficial PyTorch implementation of the paper "Progressive Growing of GANs for Improved Quality, Stability, and Variation"
MIT License

Output size #55

Closed: minxdragon closed this issue 3 years ago

minxdragon commented 3 years ago

Hi @akanimax, is there anywhere that the output size of the generated samples is specified? Can I increase it at all? Thanks!

akanimax commented 3 years ago

Hey, changing the depth of the network produces higher-resolution images. Note that you can only increase the size of the generated samples in powers of 2 (akin to mip-maps). And you do need to retrain the network at this new, higher resolution.
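
For reference, here is a minimal sketch of the depth-to-resolution mapping, assuming the first block generates 4x4 images and each additional depth level doubles the resolution (the standard progressive-growing scheme; the exact indexing may differ slightly between versions of this repo):

```python
# Assumed mapping: stage 1 produces 4x4, each further stage doubles the side length.
def output_resolution(depth: int) -> int:
    return 4 * 2 ** (depth - 1)

for d in range(1, 10):
    side = output_resolution(d)
    print(f"depth={d} -> {side}x{side}")
# depth=1 -> 4x4, depth=2 -> 8x8, ..., depth=9 -> 1024x1024
```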

minxdragon commented 3 years ago

thank you!

minxdragon commented 3 years ago

Hmm, whichever depth I try, I get AssertionError: batch_sizes are not compatible with depth. Do I need to change the batch_sizes as well? I'm trying every power of 2 I can think of!

akanimax commented 3 years ago

Oh yeah, you do need to change all the progressive-growing parameters to be compatible with the chosen depth, and batch_sizes is indeed one of them. Afair, the other two are epochs and fade_in_percentages. You don't need to change the values themselves; just make sure that these lists have a length equal to the requested depth.
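
A minimal sketch of what that looks like, assuming the training routine takes one entry per growth stage for each of these lists (the names follow the thread; the exact train() signature may differ in your version of the repo):

```python
# Hypothetical configuration for depth = 7 (up to 256x256 if stage 1 is 4x4).
# The point is only that every per-stage list has exactly `depth` entries.
depth = 7

batch_sizes = [32, 32, 32, 16, 16, 8, 4]      # one batch size per stage
epochs = [10, 20, 20, 20, 30, 30, 40]         # training epochs per stage
fade_in_percentages = [50] * depth            # fade-in percentage per stage

assert len(batch_sizes) == len(epochs) == len(fade_in_percentages) == depth
```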

dhawan98 commented 3 years ago

What would be a good combination of depths and batch_sizes?

akanimax commented 3 years ago

Also see #57. Please refer to the Progressive Growing of GANs paper for the hyperparameters to use with specific datasets. Afair, CIFAR-10 is indeed one of the benchmarks.
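
As a rough illustration only (these are not the paper's exact hyperparameters), the usual pattern is to shrink the per-stage batch size as the resolution grows, so that the high-resolution stages still fit in GPU memory:

```python
# Illustrative rule of thumb only -- NOT the values from the ProGAN paper.
# Batch sizes are typically reduced as the resolution doubles, to stay
# within GPU memory at the later, high-resolution stages.
depth = 9
resolutions = [4 * 2 ** d for d in range(depth)]   # 4x4 ... 1024x1024
batch_sizes = [128, 128, 64, 32, 16, 8, 4, 4, 2]   # one entry per stage
for res, bs in zip(resolutions, batch_sizes):
    print(f"{res}x{res}: batch size {bs}")
```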

Closing this issue for now.