codeplea / genann

simple neural network library in ANSI C
https://codeplea.com/genann
zlib License

Implement relu #32

Open · kasey- opened this issue 5 years ago

kasey- commented 5 years ago

Hello,

I started to implement the relu function for the genann library on a fork under my name (https://github.com/kasey-/genann) before sending you a PR:

    /* "unused" marks the parameter unused, matching genann.c's other
     * activation function signatures. */
    double inline genann_act_relu(const struct genann *ann unused, double a) {
        return (a > 0.0) ? a : 0.0;   /* f(x) = max(0, x) */
    }

But I am a bit lost in the way you compute the back propagation of the neural network. The derivative of relu is trivial: (a > 0.0) ? 1.0 : 0.0. But I cannot understand where I should plug it into your formula, as I do not understand how you compute your back propagation. Did you implement only the derivative of the sigmoid?
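
Digging through genann.c a bit more, I think this is why I could not find where to plug it in: the library never stores the raw weighted sum x, only the activated output o = f(x), so the derivative has to be rewritten in terms of o. If I have that right, the table looks like this (only the sigmoid line actually appears in genann.c today):

    /*
     * Backprop needs f'(x) at each neuron, but genann keeps only the
     * activated output o = f(x). Derivatives rewritten in terms of o:
     *
     *   sigmoid: o = 1/(1+exp(-x))  =>  f'(x) = o * (1.0 - o)
     *   tanh:    o = tanh(x)        =>  f'(x) = 1.0 - o * o
     *   relu:    o = max(0, x)      =>  f'(x) = (o > 0.0) ? 1.0 : 0.0
     */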

ilia3101 commented 5 years ago

I would like ReLU in genann too, but the back propagation is also something I don't understand here :(

msrdinesh commented 4 years ago

Yes, in the code only the derivative of the sigmoid, d/dx σ(x) = σ(x)(1 − σ(x)), is implemented. I think we have to write a generic derivative function so that we can add other activation functions like tanh and ReLU. Check the code here: https://github.com/kasey-/genann/blob/27c4c4288728791def0c5fd175c1c3999057ad9d/genann.c#L335 If you agree, I can also work on this.
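
Roughly what I have in mind (the names and the extra struct fields are only a proposal; nothing like this exists in genann yet):

    struct genann;   /* from genann.h */

    /* Proposal: a derivative callback with the same shape as
     * genann_actfun, but taking the stored output o = f(x). */
    typedef double (*genann_actderiv)(const struct genann *ann, double o);

    static double genann_act_sigmoid_deriv(const struct genann *ann, double o) {
        (void)ann;
        return o * (1.0 - o);            /* sigma'(x) = o * (1 - o) */
    }

    static double genann_act_relu_deriv(const struct genann *ann, double o) {
        (void)ann;
        return (o > 0.0) ? 1.0 : 0.0;    /* 1 where the unit was active */
    }

    /* The genann struct would then grow matching fields, e.g.
     *     genann_actderiv activation_hidden_deriv;
     *     genann_actderiv activation_output_deriv;
     * and genann_train would call those instead of the hardcoded
     * *o * (1.0 - *o). */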

AnnieJohnson25 commented 3 years ago

Hi everyone, to actually implement ReLU and linear, what exactly should we be looking at other than the back propagation derivative? Would it also require any additional functions, like how sigmoid has genann_act_sigmoid_cached and genann_init_sigmoid_lookup? Any advice...
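
From a quick read of genann.c, genann_init_sigmoid_lookup seems to just precompute sigmoid values into a table (because exp() is slow), and genann_act_sigmoid_cached reads from that table. If I understand correctly, ReLU and linear would not need an equivalent: they are a single comparison or a pass-through. For reference, genann already ships a linear activation in essentially this callback shape (my_act_linear is just my name for the sketch):

    /* Shape of a genann activation callback (genann_actfun in genann.h).
     * genann_act_linear in genann.c is essentially this identity. */
    double my_act_linear(const struct genann *ann, double a) {
        (void)ann;      /* network handle not needed for linear */
        return a;       /* f(x) = x, derivative 1 everywhere */
    }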

doug65536 commented 1 year ago

> Hi everyone, to actually implement ReLU and linear, what exactly should we be looking at other than the back propagation derivative? [snip]

It's right there at the top of this bug report thread: double inline genann_act_relu. (To be clear, it still doesn't work using just that, because the derivative in the training code is hardcoded to sigmoid.)
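
To make "hardcoded" concrete: in genann_train, the output-layer deltas are computed with the sigmoid factor written in terms of the stored output, as in *d++ = (*t - *o) * *o * (1.0 - *o). A sketch (untested, and the helper name is mine, not genann's) of how that loop would look with the factor swapped for ReLU:

    /* Sketch: genann_train's output-layer delta loop with the sigmoid
     * derivative replaced by the ReLU one from the top of the thread. */
    static void set_output_deltas_relu(const double *t,   /* desired outputs */
                                       const double *o,   /* actual outputs  */
                                       double *d,         /* deltas out      */
                                       int n_outputs) {
        int j;
        for (j = 0; j < n_outputs; ++j) {
            /* ReLU derivative in terms of the stored output */
            *d++ = (*t - *o) * ((*o > 0.0) ? 1.0 : 0.0);
            ++t; ++o;
        }
    }

The hidden-layer delta computation uses the same *o * (1.0 - *o) factor and would need the same change.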