accord-net / framework

Machine learning, computer vision, statistics and general scientific computing for .NET
http://accord-framework.net
GNU Lesser General Public License v2.1

(AForge) Backpropagation Algorithm assumes every layer uses the same Activation function. #241

Closed zgrkpnr closed 7 years ago

zgrkpnr commented 8 years ago
```csharp
// assume, that all neurons of the network have the same activation function
IActivationFunction function = (network.Layers[0].Neurons[0] as ActivationNeuron).ActivationFunction;
```

At line 227 of BackPropagationLearning.cs, where the code above appears, the algorithm assumes that all layers use the same activation function, which is not always the case.
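The fix is conceptually simple: instead of reading one IActivationFunction from `Layers[0]` and reusing it for every layer, the backward pass should look up each layer's own function. A minimal, hypothetical Python sketch (not Accord's API; scalar weights only, squared-error loss) of that per-layer lookup, sanity-checked against finite differences:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical toy network: one scalar weight per layer; "dact" is the
# derivative expressed in terms of the layer's OUTPUT value, as is
# conventional for sigmoid/tanh backprop. The two layers deliberately
# use DIFFERENT activation functions.
layers = [
    {"w": 0.5,  "act": math.tanh, "dact": lambda y: 1.0 - y * y},   # hidden: tanh
    {"w": -0.3, "act": sigmoid,   "dact": lambda y: y * (1.0 - y)}, # output: sigmoid
]

def forward(x):
    outs = [x]
    for layer in layers:
        outs.append(layer["act"](layer["w"] * outs[-1]))
    return outs

def gradients(x, target):
    """Backprop for the loss 0.5 * (output - target) ** 2, looking up each
    layer's own derivative instead of reusing one shared function."""
    outs = forward(x)
    delta = (outs[-1] - target) * layers[-1]["dact"](outs[-1])
    grads = [0.0] * len(layers)
    for i in range(len(layers) - 1, -1, -1):
        grads[i] = delta * outs[i]
        if i > 0:
            # the key point: consult layer i-1's OWN activation derivative,
            # rather than a single function taken from the first layer
            delta = delta * layers[i]["w"] * layers[i - 1]["dact"](outs[i])
    return grads

def loss(x, target):
    return 0.5 * (forward(x)[-1] - target) ** 2

# Check against central finite differences: both gradients match even
# though the two layers use different activation functions.
analytic = gradients(1.0, 0.9)
for i, layer in enumerate(layers):
    w0, eps = layer["w"], 1e-6
    layer["w"] = w0 + eps; up = loss(1.0, 0.9)
    layer["w"] = w0 - eps; down = loss(1.0, 0.9)
    layer["w"] = w0
    print(abs(analytic[i] - (up - down) / (2 * eps)) < 1e-6)  # True
```

If the backward pass instead applied the hidden layer's tanh derivative to the sigmoid output layer (which is what the shared-function assumption amounts to), the finite-difference check above would fail.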

cesarsouza commented 8 years ago

Thanks for reporting the issue. Yes, this is a known problem that dates back to the AForge.NET code. Ideally, I would like to re-write the Neuro module using DiffSharp (http://diffsharp.github.io/DiffSharp/). It would be way more useful and would solve many problems with the current implementation.
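For context, the appeal of an autodiff rewrite is that per-layer derivatives stop being anyone's responsibility: differentiation composes automatically through whatever mix of activations the network uses, so a bug like this one cannot arise. A hypothetical forward-mode (dual-number) sketch in Python, purely illustrative and unrelated to DiffSharp's actual API:

```python
import math

# Hypothetical dual-number autodiff sketch: each value carries its own
# derivative, and the chain rule is applied automatically as operations
# compose, so mixing tanh and sigmoid layers needs no special handling.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule, applied automatically
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def tanh(x):
    t = math.tanh(x.val)
    return Dual(t, (1.0 - t * t) * x.dot)  # chain rule for the primitive

def sigmoid(x):
    s = 1.0 / (1.0 + math.exp(-x.val))
    return Dual(s, s * (1.0 - s) * x.dot)

# Differentiate sigmoid(0.3 * tanh(w)) with respect to w at w = 0.5,
# without hand-writing a derivative for the composite network:
w = Dual(0.5, 1.0)  # seed: dw/dw = 1
y = sigmoid(0.3 * tanh(w * 1.0))
# y.val is the network output; y.dot is the exact derivative d(output)/dw
```

Real autodiff libraries like DiffSharp use the same principle (typically with reverse mode for efficiency), which is why a rewrite on top of one would make the shared-activation assumption disappear entirely.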

zgrkpnr commented 8 years ago

Glad to hear that you have plans for the Neuro module. Looking forward to it.

cesarsouza commented 7 years ago

The Accord.Neuro namespace will be replaced by https://github.com/cesarsouza/keras-sharp. This issue might not be relevant once this happens, so I am closing it for now.