joyhuang9473 opened 7 years ago
The activation function of SNNs is the "scaled exponential linear unit" (SELU), which induces self-normalizing properties.
SNNs work well for architectures with many layers, allow a novel regularization scheme to be introduced, and learn very robustly.
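For context, here is a minimal NumPy sketch of the SELU activation (not taken from the linked repo); the constants λ ≈ 1.0507 and α ≈ 1.6733 are the fixed values derived in the paper:

```python
import numpy as np

# Fixed constants derived in the SNN paper so that activations
# converge towards zero mean and unit variance across layers.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit:
    lambda * x                     for x > 0
    lambda * alpha * (exp(x) - 1)  for x <= 0
    """
    x = np.asarray(x, dtype=np.float64)
    return LAMBDA * np.where(x > 0, x, ALPHA * np.expm1(x))

# Example: negative inputs saturate towards -lambda*alpha ≈ -1.7581,
# positive inputs are scaled linearly.
print(selu(np.array([-5.0, -1.0, 0.0, 1.0])))
```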
HolmesShuan/SNNs-Self-Normalizing-Neural-Networks-Caffe-Reimplementation: https://github.com/HolmesShuan/SNNs-Self-Normalizing-Neural-Networks-Caffe-Reimplementation
Self-Normalizing Neural Networks