Closed · LonglongaaaGo closed this issue 3 years ago
Hello! Thanks for your beautiful work. I noticed that you use BatchNorm throughout the network, so I'm confused about why you don't use InstanceNorm, given that InstanceNorm is better than BN in the image synthesis task.

Hi, I'm not sure about the statement "InstanceNorm is better than BN in the image synthesis task". Please feel free to replace BN with IN and see how it turns out.
Thank you! I also found that the hinge loss of the WGAN incorporates random noise, like `torch.rand_like(pred) * 0.2`. Can you share the reason for it? Thank you!
It's about relaxing the boundary between real and fake labels. Think about it as a label smoothing trick.