tessavdheiden closed this issue 6 years ago
This is done for label smoothing, as suggested here: https://github.com/soumith/ganhacks#6-use-soft-and-noisy-labels
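A minimal sketch of what soft/noisy labels look like in practice (this is illustrative, not the repository's actual `losses.py`; the [0.7, 1.2] and [0.0, 0.3] ranges are the ones suggested in the ganhacks link above):

```python
import random

def soft_labels(n, real=True):
    """Soft/noisy targets: instead of hard 1s and 0s, sample real
    targets from roughly [0.7, 1.2] and fake targets from roughly
    [0.0, 0.3], so the discriminator never sees perfectly confident
    labels (ranges follow the ganhacks suggestion)."""
    lo, hi = (0.7, 1.2) if real else (0.0, 0.3)
    return [random.uniform(lo, hi) for _ in range(n)]
```

These noisy targets would then be fed to the usual BCE-style discriminator loss in place of constant 1/0 labels.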
Hi Agrim!
Thanks. Unfortunately I cannot use this trick for my WGAN* setup (which I switched to for better convergence of the generator's gradient): the WGAN loss is computed over the critic's scores only, so there are no labels to smooth.
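To make the "scores only, no labels" point concrete, here is a minimal sketch of the WGAN losses (a generic illustration, not code from this repository):

```python
def wgan_critic_loss(real_scores, fake_scores):
    """WGAN critic loss: a difference of mean critic scores.
    Note that no 0/1 labels appear anywhere, which is why label
    smoothing has nothing to act on."""
    return (sum(fake_scores) / len(fake_scores)
            - sum(real_scores) / len(real_scores))

def wgan_generator_loss(fake_scores):
    """WGAN generator loss: the generator tries to raise the
    critic's score on fake samples."""
    return -sum(fake_scores) / len(fake_scores)
```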
Have you looked into other models (VGAN, for instance)? It would be great to know, so I can learn from your experience :).
*Changed the loss functions, added gradient clipping of the D, changed optimizer to RMSprop.
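For reference, the clipping step from the classic WGAN recipe can be sketched as follows (the note above says "gradient clipping of the D"; the original WGAN paper actually clips the critic's *weights* to [-c, c] after each update, with c = 0.01 as the default — this sketch shows that weight-clipping variant and is not code from this repository):

```python
def clip_parameters(params, c=0.01):
    """Clamp each critic parameter to [-c, c] after an optimizer
    step, as in the original WGAN training loop (c=0.01 is the
    paper's default)."""
    return [max(-c, min(c, p)) for p in params]
```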
Hi Agrim!
I saw that you add uniform noise to the targets and labels. Can you explain why you do this, and how the ranges of these uniform distributions were chosen?
From losses.py: