eriklindernoren / PyTorch-GAN

PyTorch implementations of Generative Adversarial Networks.
MIT License

ReLU in Generator of DCGAN #123

Closed chetanpandey1266 closed 4 years ago

chetanpandey1266 commented 4 years ago

The original paper recommends the use of ReLU in the generator of DCGAN, but in the implementation LeakyReLU is used. Is there a theoretical reason for this, or is it just a practical advantage?

gordicaleksa commented 4 years ago

Just try out both and see what you get. People tend to copy/paste these choices: somebody said LeakyReLU with a slope of 0.2 works in some context, so they assume it works in another context as well (and it usually does). It's hard to notice a significant difference because we still don't have good ways to objectively evaluate GANs; to the best of my knowledge, the Inception Score (IS) was an attempt, and it kind of works, but only in a fairly specific setting.
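
For anyone who wants to run that comparison, here is a minimal sketch (not the repository's exact code) of a DCGAN-style generator upsampling block with the activation passed in as a parameter, so the ReLU and LeakyReLU variants can be swapped directly. The channel counts and input size are illustrative assumptions.

```python
# Minimal sketch: DCGAN-style generator block with a configurable activation.
# Layer sizes below are illustrative, not the repository's exact configuration.
import torch
import torch.nn as nn


def up_block(in_ch, out_ch, activation):
    """ConvTranspose -> BatchNorm -> activation, as in typical DCGAN generators."""
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        activation,
    )


# Variant recommended by the DCGAN paper: plain ReLU in the generator.
relu_block = up_block(128, 64, nn.ReLU(inplace=True))

# Variant used in this repository's implementation: LeakyReLU with slope 0.2.
leaky_block = up_block(128, 64, nn.LeakyReLU(0.2, inplace=True))

x = torch.randn(1, 128, 16, 16)
print(relu_block(x).shape, leaky_block(x).shape)  # both: torch.Size([1, 64, 32, 32])
```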

chetanpandey1266 commented 4 years ago

Okay, I'll try.