hughperkins / DeepCL

OpenCL library to train deep convolutional neural networks
Mozilla Public License 2.0

elu activation #46

Closed: ghost closed this issue 8 years ago

ghost commented 8 years ago

Hi, could it be that your derivative of the ELU activation function is wrong?

ELU (alpha = 1 is left out of the equations):

forward: `x >= 0 ? x : exp(x) - 1`
correct backward: `x >= 0 ? 1 : exp(x)`
instead of the current: `x >= 0 ? 1 : x + 1`
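For reference, here is a minimal check (plain C, not DeepCL code; the function names are made up for illustration) that compares the analytic backward form I'd expect against a central finite difference of the forward pass:

```c
#include <math.h>
#include <stdio.h>

/* ELU with alpha = 1 */
static double elu_forward(double x) {
    return x >= 0 ? x : exp(x) - 1;
}

/* Derivative expressed in terms of the input x. */
static double elu_backward_from_input(double x) {
    return x >= 0 ? 1 : exp(x);
}

int main(void) {
    const double h = 1e-6;
    for (double x = -2.0; x <= 2.0; x += 0.5) {
        /* Central finite difference of the forward pass. */
        double numeric = (elu_forward(x + h) - elu_forward(x - h)) / (2 * h);
        printf("x=%5.2f  analytic=%.6f  numeric=%.6f\n",
               x, elu_backward_from_input(x), numeric);
    }
    return 0;
}
```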

thanks, filip

ghost commented 8 years ago

Sorry, I made a mistake: your implementation is completely correct. The `x + 1` form is the same derivative expressed in terms of the output instead of the input: for x < 0 the output is y = exp(x) - 1, so dy/dx = exp(x) = y + 1. Just tested with the MNIST dataset and it gives good results!
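To make the equivalence concrete, a small sketch (plain C with hypothetical helper names, not taken from the DeepCL sources) verifying that the derivative taken from the output, `y >= 0 ? 1 : y + 1`, matches the input form `x >= 0 ? 1 : exp(x)` everywhere:

```c
#include <assert.h>
#include <math.h>
#include <stdio.h>

/* ELU with alpha = 1: y = x for x >= 0, exp(x) - 1 otherwise. */
static double elu_forward(double x) {
    return x >= 0 ? x : exp(x) - 1;
}

/* Derivative written in terms of the output y: for x < 0,
 * y = exp(x) - 1, hence dy/dx = exp(x) = y + 1.
 * (ELU is monotonic with ELU(0) = 0, so y >= 0 exactly when x >= 0.) */
static double elu_backward_from_output(double y) {
    return y >= 0 ? 1 : y + 1;
}

int main(void) {
    for (double x = -3.0; x <= 3.0; x += 0.25) {
        double y = elu_forward(x);
        double from_input  = x >= 0 ? 1 : exp(x);  /* derivative via input  */
        double from_output = elu_backward_from_output(y);
        assert(fabs(from_input - from_output) < 1e-12);
    }
    printf("both backward forms agree on [-3, 3]\n");
    return 0;
}
```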