cazala / synaptic

architecture-free neural network library for node.js and the browser
http://caza.la/synaptic

Custom Activation Function #244

Open NarendraPatwardhan opened 7 years ago

NarendraPatwardhan commented 7 years ago

I wish to define a new polynomial-based activation function that is well constrained between -1 and 1, like the predefined functions in the library. However, the function defined as follows gives a constant error, unlike the predefined activation functions:

Neuron.squash.MYACT = function (x, derivate) {
  // Derivative of the polynomial below
  if (derivate)
    return 1.05114 - 0.0917294 * Math.pow(x, 2) - 0.115071 * Math.pow(x, 4);
  return 1.05114 * x - 0.0305765 * Math.pow(x, 3) - 0.0230142 * Math.pow(x, 5);
};

Could you guide me on how to properly define an activation function in synaptic, or on adding a closure to the library for defining activation functions (that is, a function that returns an activation function)?
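A closure of that shape might look like the following sketch. It is not part of synaptic's API; only the `(x, derivate)` calling convention is taken from the library, and `makePolySquash` is a hypothetical helper name:

```javascript
// Hypothetical factory: builds a squash function in synaptic's
// (x, derivate) style from polynomial coefficients, where coeffs[i]
// is the coefficient of x^i.
function makePolySquash(coeffs) {
  return function (x, derivate) {
    if (derivate) {
      // Derivative: sum over i of i * coeffs[i] * x^(i-1)
      return coeffs.reduce(function (sum, c, i) {
        return i > 0 ? sum + i * c * Math.pow(x, i - 1) : sum;
      }, 0);
    }
    // Value: sum over i of coeffs[i] * x^i
    return coeffs.reduce(function (sum, c, i) {
      return sum + c * Math.pow(x, i);
    }, 0);
  };
}

// Coefficients from the polynomial in the question:
// 1.05114*x - 0.0305765*x^3 - 0.0230142*x^5
var MYACT = makePolySquash([0, 1.05114, 0, -0.0305765, 0, -0.0230142]);
```

The factory computes the derivative from the same coefficient array, so the two branches cannot drift out of sync the way hand-written pairs can.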

wagenaartje commented 7 years ago

The predefined activation functions in the library aren't actually constrained to (-1,1). Only one of them (TANH) is. LOGISTIC and HLIM have range (0,1), and IDENTITY and RELU can exceed 1.

Besides, your custom activation function is NOT constrained to (-1,1). Your function assumes that x will be within the range (0,1), which is not the case (and this has nothing to do with the ranges of activation functions). Note that activation values of neurons always get multiplied by a weight (which can be larger than 1) and added together to form the input of another neuron, so input values can be arbitrarily small or large.
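To illustrate with rough numbers (the activations and weights below are arbitrary, chosen only for the example): even when every upstream activation lies in (0,1), the weighted sum fed to the next neuron can leave that range:

```javascript
// Three upstream activations, each within (0, 1)
var activations = [0.9, 0.8, 0.7];

// Weights are unconstrained and can exceed 1 in magnitude
var weights = [2.5, -3.0, 4.0];

// The input to the next neuron is the weighted sum
var input = activations.reduce(function (sum, a, i) {
  return sum + a * weights[i];
}, 0);

console.log(input); // a value near 2.65 -- well outside (0, 1)
```

This is why a squash function has to behave sensibly for all real inputs, not just for a narrow range.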

Also, you should at least test how well your activation function works on multiple datasets.

Some links:

Once you are more familiar with how neural networks work, you can start creating your own activation functions. Do keep in mind that making an activation function is easy, but it takes a lot of effort to test whether it actually works on the majority of datasets out there. A lot of research goes into creating new activation functions, like SELU.
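For reference, SELU could be written in the same `(x, derivate)` style as the library's squash functions. This is a sketch, not code from synaptic; the constants are the published SELU values:

```javascript
// SELU constants from Klambauer et al., 2017
var ALPHA = 1.6732632423543772;
var SCALE = 1.0507009873554805;

// SELU: scale * x for x > 0, scale * alpha * (e^x - 1) otherwise.
function selu(x, derivate) {
  if (derivate) {
    // Derivative: scale for x > 0, scale * alpha * e^x otherwise
    return x > 0 ? SCALE : SCALE * ALPHA * Math.exp(x);
  }
  return x > 0 ? SCALE * x : SCALE * ALPHA * (Math.exp(x) - 1);
}
```

Note that SELU is unbounded above and bounded below by -SCALE * ALPHA, so it is another example of a well-studied activation that is not confined to (-1,1).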

NarendraPatwardhan commented 6 years ago

Thank you, and indeed you are right that the function was not constrained to (-1,1) like TANH is. I have modified it so that it is. Could you please tell me whether the method I used to define the activation function (code) is correct? If not, could you point me to any resource on how to create an activation function in synaptic?