Closed: iver56 closed this issue 5 years ago
It isn't included yet.
One nice way to include this, which doesn't require much work, is to extend the signature of activation functions and in-place activation functions to include kwargs. Currently, they cannot be parameterized. Once this feature is in, we can not only easily do lrel, but also things like scaled sigmoid/tanh, clipped relu, etc., by just adding them to the handlers.
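To illustrate the idea, here is a minimal NumPy sketch of what such a kwargs-friendly activation signature could look like. The function names and signatures here are assumptions for illustration only, not brainstorm's actual handler API: the point is just that an extra keyword argument (here a hypothetical alpha) is enough to express a leaky rectified linear unit alongside the plain one.

```python
import numpy as np

def rel(inputs, outputs):
    # Plain rectified linear: out = max(0, in)
    np.maximum(inputs, 0, out=outputs)

def lrel(inputs, outputs, alpha=0.01):
    # Leaky rectified linear: out = in if in > 0 else alpha * in
    # (for 0 < alpha < 1, max(in, alpha * in) gives exactly this)
    np.maximum(inputs, alpha * inputs, out=outputs)

x = np.array([-2.0, -0.5, 0.0, 1.5])
y = np.empty_like(x)
lrel(x, y, alpha=0.1)
print(y)  # [-0.2  -0.05  0.    1.5 ]
```

With a signature like this, scaled sigmoid/tanh or clipped relu would follow the same pattern: each variant is just another handler entry with its own keyword arguments.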
I'm a bit swamped right now and would like to spend time on docs when I have it. If anyone would like to take a crack at it, I can review. Otherwise, I'll come back to it later.
So far, I've tried the linear, rel (rectified linear), tanh and sigmoid activation functions in brainstorm. I've also looked for a built-in leaky rectified linear unit activation function (because it is a key component in a winning solution in a CIFAR-10 competition [1]), but couldn't find it. Is there such a thing? It would be a nice thing to have :+1:

[1] http://blog.kaggle.com/2015/01/02/cifar-10-competition-winners-interviews-with-dr-ben-graham-phil-culliton-zygmunt-zajac/