spribitzer / deerlearn

Python-based machine learning program to analyze DEER data
MIT License

use ReLU instead of sigmoid activation function #1

Open spribitzer opened 5 years ago

spribitzer commented 5 years ago

Apparently sigmoid activation functions are no longer how most modern neural networks are built, and ReLUs are easier to train (they don't saturate, so gradients don't vanish in deeper layers).
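
A minimal sketch of the proposed swap, assuming a small fully connected Keras model; the layer count and sizes (a 256-point DEER trace in, a 256-point distance distribution out) are hypothetical and not taken from this repository:

```python
import tensorflow as tf

def build_model(hidden_activation="relu"):
    # Hidden layers use the chosen activation; the output layer keeps a
    # sigmoid so the predicted distribution stays bounded in [0, 1].
    # Hypothetical sizes: 256-point input trace, 256-point distribution.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(256,)),
        tf.keras.layers.Dense(256, activation=hidden_activation),
        tf.keras.layers.Dense(256, activation=hidden_activation),
        tf.keras.layers.Dense(256, activation=hidden_activation),
        tf.keras.layers.Dense(256, activation="sigmoid"),
    ])

relu_model = build_model("relu")        # proposed: easier to train
sigmoid_model = build_model("sigmoid")  # current: prone to vanishing gradients
```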

spribitzer commented 4 years ago

Using ReLU as the activation for any layer but the last introduces spiky features into the distance distribution (see figure: blue is the true distribution, red the fit).

The effect is most pronounced when the third layer is ReLU, and barely noticeable when only the first or second layer is.
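
Based on that observation, one configuration worth trying is ReLU only in the early hidden layers and a smooth activation in the layers closest to the output. This is a sketch of that idea, not something tested here; layer sizes are the same hypothetical ones as above:

```python
import tensorflow as tf

# ReLU in the first two hidden layers (where the figure shows little effect
# on the distribution), smooth sigmoid in the third hidden layer and output,
# since a ReLU third layer produced the most pronounced spikes.
mixed_model = tf.keras.Sequential([
    tf.keras.Input(shape=(256,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="sigmoid"),
    tf.keras.layers.Dense(256, activation="sigmoid"),
])
```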