bioinf-jku / SNNs

Tutorials and implementations for "Self-normalizing networks"
GNU General Public License v3.0

batch normalization #1

Closed · zhly0 closed this issue 7 years ago

zhly0 commented 7 years ago

@bioinf-jku, thank you for your nice work! I am new to deep learning and have a simple question. Since the net in your test code is not very deep, it makes no big difference to add batch normalization layers after each convolution layer. But if the net is very deep, is it necessary to add a batch normalization layer after each convolution layer? Or is there no need to do so, since the SELU activation function already has the ability to normalize the layer inputs? Thank you in advance!

gklambauer commented 7 years ago

Hello zhly0, yes, you are right that the architectures we present in the tutorials are not too deep. For fully-connected networks, batchnorm is not necessary. For CNNs, some people have left batchnorm in as a regularizer. There is a discussion about this on reddit: https://www.reddit.com/r/MachineLearning/comments/6g5tg1/r_selfnormalizing_neural_networks_improved_elu/?st=j3wrdmix&sh=d2824d1a Regards.
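
As a minimal sketch of what this means in practice (assuming tf.keras; this is illustrative, not code from these tutorials): in a fully-connected self-normalizing network, the batchnorm layers are simply omitted, and SELU is paired with lecun_normal initialization and AlphaDropout as recommended in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def snn_mlp(input_dim, n_classes):
    """Hypothetical fully-connected self-normalizing network."""
    return models.Sequential([
        layers.Input(shape=(input_dim,)),
        # SELU + lecun_normal keeps activations close to zero mean
        # and unit variance, so no BatchNormalization layer is added.
        layers.Dense(256, activation="selu",
                     kernel_initializer="lecun_normal"),
        layers.AlphaDropout(0.05),  # variance-preserving dropout for SELU
        layers.Dense(256, activation="selu",
                     kernel_initializer="lecun_normal"),
        layers.AlphaDropout(0.05),
        layers.Dense(n_classes, activation="softmax"),
    ])
```

For a CNN, the analogous sketch would use Conv2D layers with the same activation and initializer; whether to additionally keep batchnorm there as a regularizer is exactly the open point discussed in the reddit thread above.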

zhly0 commented 7 years ago

@gklambauer thanks for the reply!