igul222 / improved_wgan_training

Code for reproducing experiments in "Improved Training of Wasserstein GANs"
MIT License

G loss increases very suddenly? #52

Closed: LynnHo closed this issue 4 years ago

LynnHo commented 6 years ago

@igul222

I have implemented WGAN-GP myself. In my application, the G loss sometimes increases very suddenly: [image]

But the D loss is still stable: [image]

And the gradient penalty is also stable: [image]

Any ideas? Thanks!
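For context, the gradient-penalty term being discussed penalizes the critic's input-gradient norm for deviating from 1 at random interpolates between real and fake samples. A minimal sketch with a toy *linear* critic (whose input gradient is exactly its weight vector, so no autograd is needed) is below; all names are illustrative and none of this code is from the repo:

```python
import math
import random

def gradient_penalty_linear(w, real, fake, lam=10.0, seed=0):
    """Gradient penalty for a hypothetical linear critic D(x) = dot(x, w).

    For a linear critic, grad_x D(x_hat) == w at every interpolate,
    so the penalty reduces to lam * (||w|| - 1)^2 per sample.
    """
    rnd = random.Random(seed)
    penalties = []
    for r, f in zip(real, fake):
        eps = rnd.random()
        # Random interpolate between a real and a fake sample.
        # (A linear critic's gradient does not depend on it.)
        _x_hat = [eps * ri + (1 - eps) * fi for ri, fi in zip(r, f)]
        grad_norm = math.sqrt(sum(wi * wi for wi in w))
        penalties.append(lam * (grad_norm - 1.0) ** 2)
    return sum(penalties) / len(penalties)
```

A stable gradient penalty, as reported here, means the critic's gradient norms stay near 1, which is why the G-loss spike is surprising.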

crisbodnar commented 6 years ago

I experienced the same outcome in my implementation when I added learning rate decay every 25k iterations. Whenever the learning rate decreases, the G loss goes up, D's loss on real samples goes up, and D's loss on fake samples goes down (it is minus the G loss), but the overall D loss continues to decrease as normal.

Any idea what could cause this, @igul222 @martinarjovsky? Is there a correct way of doing this learning rate decay? The paper mentions that you implemented learning rate decay for the ResNet-101.

Did you manage to find a fix @LynnHo ?
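The two schedules being compared can be sketched as plain functions: a stepped decay like the one described above (drop every 25k iterations), and a linear decay to zero over the full run, which is a common alternative for WGAN-GP training. Both functions and their parameter names are illustrative assumptions, not from the repo:

```python
def stepped_lr(base_lr, step, decay_every=25_000, gamma=0.5):
    """Stepped decay: multiply the LR by `gamma` every `decay_every` steps.

    Each drop is a discontinuity in the optimizer dynamics, which can
    show up as a visible jump in the loss curves.
    """
    return base_lr * gamma ** (step // decay_every)

def linear_lr(base_lr, step, total_steps):
    """Linear decay to zero over `total_steps` iterations.

    Changes the LR by a tiny amount every step, so there is no single
    iteration where the dynamics change abruptly.
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

If the loss jump coincides exactly with the 25k-iteration boundaries, switching from the stepped schedule to a smooth one is an easy experiment to isolate the cause.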

LynnHo commented 6 years ago

@crisbodnar I have recently found in my work that layer normalization may cause this phenomenon, and that instance normalization may help.
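To make the distinction concrete (standard definitions only; this sketch is not from the repo): layer norm computes one mean/variance over all channels and positions of a sample, while instance norm computes separate statistics per channel, so one channel's scale cannot distort the others. Here `fmap` is a hypothetical single-sample feature map given as a list of per-channel flat lists:

```python
import math

def _standardize(values, eps=1e-5):
    """Shift/scale a flat list to zero mean and (near) unit variance."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return [(v - m) / math.sqrt(var + eps) for v in values]

def layer_norm(fmap):
    """Layer norm: ONE mean/variance over all channels and positions."""
    flat = _standardize([v for ch in fmap for v in ch])
    out, i = [], 0
    for ch in fmap:  # reshape back into per-channel lists
        out.append(flat[i:i + len(ch)])
        i += len(ch)
    return out

def instance_norm(fmap):
    """Instance norm: a separate mean/variance PER channel."""
    return [_standardize(ch) for ch in fmap]
```

For a feature map where one channel is uniformly large, layer norm leaves a cross-channel offset in place while instance norm flattens each channel independently, which is one plausible reason the two behave differently in the critic.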