ermig1979 / Synet

A small framework to infer neural networks
MIT License

ELU activation #5

Closed: gordinmitya closed this issue 5 years ago

gordinmitya commented 5 years ago

Hey, could you check my code for the exponential linear unit (ELU) activation? (f(x) = x if x > 0, and alpha * (exp(x) - 1) otherwise; arxiv) In our case, ELU shows much better accuracy than ReLU.
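
For concreteness, a minimal scalar sketch of that definition (the function name and plain-loop form are illustrative, not Synet's actual API):

```cpp
#include <cmath>
#include <cstddef>

// Scalar ELU: f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise.
// Hypothetical helper for illustration; not Synet's real interface.
static void EluForward(const float* src, size_t size, float alpha, float* dst)
{
    for (size_t i = 0; i < size; ++i)
    {
        float x = src[i];
        dst[i] = x > 0.0f ? x : alpha * (std::exp(x) - 1.0f);
    }
}
```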

There are probably ways to speed up the exp operation, as was done for RoughSigmoid. What do you think?
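
One common way to approximate exp cheaply (shown only as a sketch; RoughSigmoid's actual scheme may differ) is the IEEE-754 exponent-bit trick due to Schraudolph:

```cpp
#include <cstdint>
#include <cstring>

// Rough exp(x) via the Schraudolph bit trick: exp(x) = 2^(x / ln 2),
// so scale x by 2^23 / ln 2 and write the result directly into the
// exponent field of a float. Error is a few percent; valid roughly
// for x in (-87, 88), which covers the x < 0 branch of ELU.
static inline float RoughExp(float x)
{
    const float scale = 12102203.0f;      // 2^23 / ln(2)
    const int32_t bias = 127 * (1 << 23); // IEEE-754 exponent bias, shifted
    int32_t i = (int32_t)(x * scale) + bias;
    float y;
    std::memcpy(&y, &i, sizeof(y));       // reinterpret the bits as a float
    return y;
}
```

Since ELU needs exp only on the negative branch, where exp(x) lies in (0, 1], a few percent of error in the approximation translates into a small absolute error scaled by alpha, which may well be acceptable for an activation function.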