cneuralnetwork / Neural-Networks-in-every-Language

simply neural networks in every language
MIT License
26 stars 15 forks

Added activation functions. #17

Open VedantYadav025 opened 1 month ago

VedantYadav025 commented 1 month ago

Added tanh and ReLU activation functions, and their derivatives.
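A minimal sketch of what these additions might look like as free functions (the names `tanh_activation`, `tanh_derivative`, `relu`, and `relu_derivative` are assumptions, not necessarily the identifiers used in the PR):

```cpp
#include <cmath>

// Hyperbolic tangent activation and its derivative.
// d/dx tanh(x) = 1 - tanh^2(x)
double tanh_activation(double x) { return std::tanh(x); }

double tanh_derivative(double x) {
    double t = std::tanh(x);
    return 1.0 - t * t;
}

// Rectified linear unit and its derivative.
// ReLU(x) = max(0, x); the derivative is 0 for x <= 0 and 1 for x > 0.
double relu(double x) { return x > 0.0 ? x : 0.0; }

double relu_derivative(double x) { return x > 0.0 ? 1.0 : 0.0; }
```

Keeping these as plain free functions fits the stated goal of avoiding classes and templates.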

I think we could build an MNIST classifier with the functionality we have in the repo (C++) right now. I think it's best not to add classes and such to the code, and to build the classifier directly without templatizing it.

VedantYadav025 commented 1 month ago

Also, a question for the maintainers: should I write functions where sigmoid accepts a std::vector as input, or should we expect anybody building neural nets with this code to apply sigmoid (and other activation functions) value by value to the std::vector (or any other C++ container in general)?
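For reference, the vector-accepting option could look like this sketch (the name `sigmoid` and the use of `std::vector<double>` are assumptions for illustration):

```cpp
#include <cmath>
#include <vector>

// Element-wise sigmoid over a whole vector:
// sigma(x) = 1 / (1 + e^(-x)) applied to each component.
std::vector<double> sigmoid(const std::vector<double>& v) {
    std::vector<double> out;
    out.reserve(v.size());
    for (double x : v) {
        out.push_back(1.0 / (1.0 + std::exp(-x)));
    }
    return out;
}
```

The per-value alternative would be a scalar `double sigmoid(double)` that callers apply themselves, e.g. with `std::transform` over any container; the vector version is more convenient for layer-wise forward passes, while the scalar version composes with arbitrary containers.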