Closed: ghost closed this issue 8 years ago.
Hi, could it be that your derivative of the ELU activation function is wrong?
elu (alpha = 1 is left out of the equations):

forward: x >= 0 ? x : exp(x) - 1
correct backward: x >= 0 ? 1 : exp(x)
instead of: x >= 0 ? 1 : x + 1
Thanks, Filip
Sorry, I made a mistake; your implementation is completely correct. I missed that the backward pass operates on the stored output y = elu(x) rather than the input x: for x < 0, dy/dx = exp(x) = (exp(x) - 1) + 1 = y + 1, so "x + 1" is exactly the right gradient when the variable holds the output. I just tested with the MNIST dataset and it gives good results!
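For anyone landing here later, here is a minimal TypeScript sketch of the identity, assuming the backward pass receives the stored activation output (function names are mine for illustration, not the library's actual API):

```typescript
// ELU with alpha = 1.
function eluForward(x: number): number {
  return x >= 0 ? x : Math.exp(x) - 1;
}

// Derivative expressed in terms of the input x.
function eluBackwardFromInput(x: number): number {
  return x >= 0 ? 1 : Math.exp(x);
}

// Derivative expressed in terms of the stored output y = elu(x):
// for x < 0, y = exp(x) - 1, hence dy/dx = exp(x) = y + 1.
function eluBackwardFromOutput(y: number): number {
  return y >= 0 ? 1 : y + 1;
}

// Quick check: both forms agree at every point.
for (const x of [-2, -0.5, 0, 0.5, 2]) {
  const y = eluForward(x);
  console.log(x, eluBackwardFromInput(x), eluBackwardFromOutput(y));
}
```

Computing the gradient from the stored output this way is a common trick, since it avoids keeping the pre-activation input (or re-evaluating exp) during backprop.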