nashory opened 6 years ago
Normalize the weights of the previous layer dynamically; this still needs to be implemented.
Can I simply use torch nn.WeightNorm for the equalized learning rate described in the paper?
I tested torch nn.WeightNorm, but it seems to harm training.
I found https://github.com/stormraiser/GAN-weight-norm; maybe we can use this.
(https://github.com/torch/nn/blob/master/WeightNorm.lua)
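For what it's worth, equalized learning rate in the paper is not the same thing as weight normalization: the weights are stored at unit scale (initialized from N(0, 1)) and multiplied by the per-layer He constant sqrt(2 / fan_in) at every forward pass, so all layers see a similar effective gradient scale. A minimal NumPy sketch of that idea (the function name and shapes here are illustrative, not from the repo):

```python
import numpy as np

def equalized_linear(x, w, b):
    """Fully-connected layer with equalized learning rate.

    Weights are kept at unit scale and rescaled by the He-init
    constant sqrt(2 / fan_in) at runtime, instead of baking the
    scale into the initialization.
    """
    fan_in = w.shape[1]
    c = np.sqrt(2.0 / fan_in)      # He constant, applied dynamically
    return x @ (w * c).T + b

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 16))   # stored weights ~ N(0, 1)
b = np.zeros(8)
x = rng.standard_normal((4, 16))
y = equalized_linear(x, w, b)
```

Unlike nn.WeightNorm, nothing here depends on the current norm of `w`, which may explain why plain weight norm behaves differently in training.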