Open LexieYang opened 4 years ago
Thank you for the suggestion. I couldn't find any such description in the paper, but I've seen it in other implementations, so I think your suggestion is correct.
Have you tried it?
That sounds more like a tanh activation function rather than a sigmoid, which is common for GANs.
Do we need to normalize the training images to the range (-1, 1) before feeding them into the network?
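For reference, here is a minimal sketch of that kind of normalization, assuming uint8 images in [0, 255] and a generator with a tanh output; the function names are just illustrative, not from this repo:

```python
import numpy as np

def normalize_images(images: np.ndarray) -> np.ndarray:
    """Scale uint8 images from [0, 255] to [-1, 1], matching a tanh output range."""
    return images.astype(np.float32) / 127.5 - 1.0

def denormalize_images(images: np.ndarray) -> np.ndarray:
    """Map generator outputs in [-1, 1] back to [0, 255] for visualization."""
    return np.clip((images + 1.0) * 127.5, 0, 255).astype(np.uint8)
```

With this scaling, pixel value 0 maps to -1.0 and 255 maps to 1.0, so real and generated images live in the same range.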