cazala / synaptic

architecture-free neural network library for node.js and the browser
http://caza.la/synaptic

Why does HLIM's derivative equal 1? #266

Open TibboddiT opened 7 years ago

TibboddiT commented 7 years ago

Maybe I misunderstood something, but it seems to me that HLIM's derivative should be equal to 0 when x != 0.

Am I wrong?
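For reference, the hard-limit (binary step) function and its mathematical derivative can be sketched like this (an illustrative JavaScript sketch, not synaptic's actual implementation):

```javascript
// Hard limit (binary step) activation: 1 for x > 0, else 0.
function hlim(x) {
  return x > 0 ? 1 : 0;
}

// Its true derivative is 0 everywhere except at x = 0,
// where it is undefined (a Dirac delta, in the distributional sense).
function hlimDerivative(x) {
  return 0; // for all x != 0
}
```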

wagenaartje commented 7 years ago

Judging from Wikipedia (see: Binary Step), that assumption is definitely correct.

But if the derivative is 0 for all x != 0, then the error calculation will also return 0 for any neuron with Neuron.squash.HLIM. So basically, the values (weight, bias) won't change at all when the derivative is 0.

So my only guess is: the derivative for x != 0 is chosen to be 1 so that error values can still propagate through the network.
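A toy single-weight update shows why: a simplified sketch of a gradient-descent delta (not synaptic's internal code), where the factor f'(net) multiplies everything else.

```javascript
// Simplified gradient-descent update for a single weight:
// deltaW = learningRate * error * f'(net) * input
// (illustrative sketch, not synaptic's internal code)
function weightUpdate(learningRate, error, derivative, input) {
  return learningRate * error * derivative * input;
}

// With the step function's true derivative (0 for x != 0),
// the update is always 0 and the weight is frozen:
const frozen = weightUpdate(0.1, 0.8, 0, 1.0); // always 0

// Pretending the derivative is 1 lets the error reach the weight:
const moving = weightUpdate(0.1, 0.8, 1, 1.0); // nonzero update
```

So returning 1 for the derivative acts as a pass-through that keeps the error signal alive, at the cost of not being mathematically correct.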

TibboddiT commented 7 years ago

Yep, I understand. I always thought using this activation function for any node other than those in the output layer was a bad idea anyway.

And this is the way I wanted to use it: with a 0 bias, and not backprop-sensitive, just to make the output 0 or 1... pretty useless, since this can easily be done by post-processing the output.

Note to myself: you can make the sigmoid function very close to the step function, just use a big lambda!
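A sketch of that idea, with lambda as the steepness parameter (names here are illustrative, not from the library): as lambda grows, the sigmoid approaches the step function, but its derivative stays nonzero, so backprop still works.

```javascript
// Logistic sigmoid with steepness lambda:
// approaches the binary step as lambda -> infinity.
function sigmoid(x, lambda) {
  return 1 / (1 + Math.exp(-lambda * x));
}

// Unlike the hard step, the derivative is nonzero everywhere:
// d/dx sigmoid(x) = lambda * s * (1 - s), where s = sigmoid(x)
function sigmoidDerivative(x, lambda) {
  const s = sigmoid(x, lambda);
  return lambda * s * (1 - s);
}
```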

Should I close this issue? (The derivative is still not correct.)