Open lennert130994 opened 5 years ago
Hi, have you figured this out? I am trying to train a GAN and my losses look somewhat similar to your second figure.
I am using a U-Net for the generator and an 8-layer CNN for the discriminator. I was also thinking it may be happening because the discriminator has too many layers. My understanding is that the generator has a much more complex task than the discriminator (generating samples vs. simple binary classification), which is why the discriminator needs to be simpler to strike a balance. Thank you for your reply. If you have anything to add, or think I'm heading in the wrong direction, please let me know. Thanks again!
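One common way to rebalance things without redesigning the discriminator is to change the update schedule: take several generator steps per discriminator step so G isn't left behind. This is only a sketch of that idea; `run_epoch` and its stand-in update counters are hypothetical, not from any library, and real code would replace the counters with actual optimizer steps:

```python
def run_epoch(n_batches, g_steps_per_d_step=1):
    """Interleave D and G updates over one epoch.

    Raising g_steps_per_d_step gives the generator extra updates
    per discriminator update, a simple way to keep an overpowering
    discriminator from winning too quickly.
    """
    d_updates = 0
    g_updates = 0
    for _ in range(n_batches):
        d_updates += 1                      # one discriminator step on this batch
        for _ in range(g_steps_per_d_step):
            g_updates += 1                  # one or more generator steps
    return d_updates, g_updates

# With 3 G steps per D step over 10 batches:
print(run_epoch(10, g_steps_per_d_step=3))  # → (10, 30)
```

The same lever works in the other direction (more D steps per G step, as in WGAN-style training) when the discriminator is the one falling behind.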
So I have a simple GAN with dense layers and LeakyReLU activations, nothing special. When the GAN is training correctly and the generated samples look good, the training loss looks like this, with blue being D and orange G:
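For concreteness, here is roughly what "dense layers and LeakyReLU" means for the generator's forward pass. This is a minimal numpy sketch, not the poster's actual code; the layer sizes (latent dim 16, hidden 32, output 64) and the 0.2 leak slope are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, alpha=0.2):
    # LeakyReLU: pass positives through, scale negatives by alpha
    return np.where(x > 0, x, alpha * x)

def dense(x, w, b):
    # fully connected layer: x @ w + b
    return x @ w + b

# Hypothetical layer shapes: latent 16 -> hidden 32 -> output 64
w1 = rng.standard_normal((16, 32)) * 0.1
b1 = np.zeros(32)
w2 = rng.standard_normal((32, 64)) * 0.1
b2 = np.zeros(64)

def generator(z):
    h = leaky_relu(dense(z, w1, b1))
    return np.tanh(dense(h, w2, b2))  # squash samples into [-1, 1]

z = rng.standard_normal((4, 16))     # batch of 4 latent vectors
samples = generator(z)
# samples.shape == (4, 64), all values in [-1, 1]
```

The discriminator would be the mirror image: dense + LeakyReLU layers ending in a single sigmoid logit for the real/fake decision.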
A lot of times the GAN training goes wrong and the generated samples look like, well, shit. The training loss in that case looks like this:
Any help to what might cause this would be greatly appreciated!