number9473 / nn-algorithm

algorithm for neural network

Self-Normalizing Neural Networks #110

Open joyhuang9473 opened 7 years ago

joyhuang9473 commented 7 years ago

Self-Normalizing Neural Networks

The activation function of SNNs is the "scaled exponential linear unit" (SELU), which induces self-normalizing properties.

SNNs work well for deep architectures with many layers, allow for a novel regularization scheme, and make learning very robust.
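For reference, a minimal NumPy sketch of the SELU activation using the fixed constants α ≈ 1.6733 and λ ≈ 1.0507 from the paper; the small demo at the end is an illustrative assumption, not part of the issue:

```python
import numpy as np

# SELU constants from Klambauer et al., "Self-Normalizing Neural Networks"
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit:
    lambda * x               for x > 0
    lambda * alpha * (e^x-1) otherwise
    """
    x = np.asarray(x, dtype=np.float64)
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

# Illustration of the self-normalizing property: for standard-normal
# inputs, SELU activations stay close to zero mean and unit variance.
z = np.random.randn(100_000)
a = selu(z)
print(a.mean(), a.var())  # roughly 0 and 1
```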

joyhuang9473 commented 7 years ago

HolmesShuan/SNNs-Self-Normalizing-Neural-Networks-Caffe-Reimplementation: https://github.com/HolmesShuan/SNNs-Self-Normalizing-Neural-Networks-Caffe-Reimplementation