soumith / ganhacks

starter from "How to Train a GAN?" at NIPS2016

Unexplainable GAN losses, need help #57

Open · lennert130994 opened this issue 5 years ago

lennert130994 commented 5 years ago

So I have a simple GAN with dense layers and LeakyReLU activations, nothing special. When the GAN trains correctly and the generated samples look good, the training loss looks like this: [image: loss curves; blue is D, orange is G]

A lot of the time the GAN training goes wrong and the generated samples look like, well, shit. The training loss in that case looks like this: [image: loss curves for the failed run]

Any help as to what might cause this would be greatly appreciated!
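For context, the setup is roughly this minimal PyTorch sketch (the layer sizes here are illustrative, not the exact dimensions from the run above):

```python
import torch.nn as nn

latent_dim, data_dim = 100, 784  # illustrative: 100-d noise, flattened 28x28 samples

# Generator: a few dense layers with LeakyReLU, Tanh output in [-1, 1]
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, data_dim), nn.Tanh(),
)

# Discriminator: mirrored dense stack ending in a single real/fake probability
D = nn.Sequential(
    nn.Linear(data_dim, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)
```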

shaurov2253 commented 4 years ago

Hi, have you figured this out? I am trying to train a GAN and my losses look somewhat similar to your second figure.

lennert130994 commented 4 years ago

Hi,

In my case it was caused by the GAN being too complex for the input data I fed it. So by doing some testing with fewer and smaller layers I eventually solved it. You also need to play around with the complexity of the discriminator and generator so that they are balanced/competing. Let me know if you need more help.

Regards,
Lennert Sietsma
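One quick way to sanity-check that balance is to compare raw parameter counts, which are a crude proxy for capacity. A small helper sketch (this is just an illustration, not something from this repo):

```python
import torch.nn as nn

def n_params(model: nn.Module) -> int:
    """Total trainable parameters -- a crude proxy for model capacity."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# e.g. with a G and D like the sketch above:
# print(f"G: {n_params(G):,} params   D: {n_params(D):,} params")
# If one side dwarfs the other by orders of magnitude, try removing layers
# or shrinking widths until the two are in the same ballpark.
```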
shaurov2253 commented 4 years ago

I am using a U-Net for the generator and an 8-layer CNN for the discriminator. I was also thinking that it may be happening because the discriminator has too many layers. My understanding is that the generator has a much harder task than the discriminator (generating samples vs. simple binary classification), which is why the discriminator needs to be simpler to strike the balance. Thank you for your reply. If you have something to add, or think I'm heading in the wrong direction, please let me know. Thanks again!
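Concretely, trimming the discriminator might look something like this (a hypothetical PyTorch sketch; the 64x64 single-channel input and channel counts are assumptions, not the setup described above):

```python
import torch.nn as nn

# A deliberately shallow 3-conv discriminator to pair with a U-Net generator.
# Assumes 64x64 single-channel images; all sizes are illustrative.
D_small = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=4, stride=2, padding=1),   # 64x64 -> 32x32
    nn.LeakyReLU(0.2),
    nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),  # 32x32 -> 16x16
    nn.LeakyReLU(0.2),
    nn.Conv2d(64, 1, kernel_size=4, stride=2, padding=1),   # 16x16 -> 8x8 patch scores
)
# Outputs a grid of raw patch logits; pair with nn.BCEWithLogitsLoss,
# so no Sigmoid layer is needed here.
```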