chetanpandey1266 closed this issue 4 years ago
Just try out both and see what you get. People tend to copy/paste these choices because somebody else said LeakyReLU with slope 0.2 works in some context, so they assume it works in another context as well. It usually does, and it's hard to notice a significant difference anyway, because to the best of my knowledge we still don't have good ways to objectively evaluate GANs. IS (Inception Score) was an attempt, and it kind of works, but only in a fairly specific setting.
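If it helps, here is a minimal sketch (assuming a PyTorch DCGAN-style generator; the helper name `gen_block` and its arguments are just for illustration) that makes the activation a parameter, so you can swap ReLU for LeakyReLU(0.2) and compare the results directly:

```python
import torch.nn as nn

def gen_block(in_ch, out_ch, negative_slope=None):
    # One DCGAN-style generator block: transposed conv + batch norm + activation.
    # negative_slope=None -> plain ReLU (as recommended in the DCGAN paper);
    # negative_slope=0.2  -> LeakyReLU, as used in many implementations.
    act = nn.ReLU(inplace=True) if negative_slope is None else nn.LeakyReLU(negative_slope, inplace=True)
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        act,
    )

# Example: same block with the two activations you want to compare.
relu_block = gen_block(128, 64)
leaky_block = gen_block(128, 64, negative_slope=0.2)
```

Train the two variants with everything else fixed and compare samples (or IS, if your data fits that setting).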
Ok, I'll try.
The original paper recommends using ReLU in the generator of DCGAN, but in many implementations LeakyReLU is used. Is there a theoretical reason for this, or is it just a practical advantage?