mrgloom opened this issue 9 years ago
Is it possible to replace the sigm activation function with ReLU in the CNN code? I tried replacing sigm with relu in cnnff.m, but it doesn't work. I guess this also requires changes to the backprop derivatives?

There is a pull request, https://github.com/rasmusbergpalm/DeepLearnToolbox/pull/131, for adding ReLU to the regular feed-forward network (nnff() and nnbp()); maybe you can borrow some ideas from there. You certainly need to add ReLU support to backprop as well.

My experience with the DeepLearnToolbox CNN code is that it is unbearably slow and rather limited; for example, it doesn't support fully connected layers at all. You may have better luck with MatConvNet, which seems to be quite full-featured, but is admittedly more complex.
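To make the paired change concrete, here is a minimal sketch. It assumes the toolbox's usual pattern, where cnnff.m applies sigm to the convolution output and cnnbp.m multiplies the backpropagated signal by the sigmoid-derivative factor a .* (1 - a); the quoted before/after lines are paraphrased from memory, and the relu and relu_grad helpers are hypothetical names, not existing toolbox functions:

```matlab
% relu.m -- hypothetical helper, not part of DeepLearnToolbox.
function y = relu(x)
    % Forward activation: elementwise max(0, x).
    y = max(x, 0);
end

% relu_grad.m -- hypothetical helper: the ReLU derivative, written in
% terms of the stored activation a = relu(x): 1 where a > 0, else 0.
function g = relu_grad(a)
    g = double(a > 0);
end

% In cnnff.m the forward line would then change from (sketch):
%     net.layers{l}.a{j} = sigm(z + net.layers{l}.b{j});
% to:
%     net.layers{l}.a{j} = relu(z + net.layers{l}.b{j});
%
% and in cnnbp.m each sigmoid-derivative factor of the form
%     net.layers{l}.a{j} .* (1 - net.layers{l}.a{j})
% would become
%     relu_grad(net.layers{l}.a{j})
% so that the deltas match the new activation.
```

Note that the output-layer delta in cnnbp.m also bakes in the sigmoid derivative (a net.o .* (1 - net.o) term, if I remember correctly), so you would either keep sigm for the final layer or adjust that term consistently as well.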