Open VedantYadav025 opened 1 month ago
Also, a question for the maintainers: should sigmoid be written to accept a std::vector directly, or should we expect anyone building neural nets with this code to apply sigmoid (and the other activation functions) value by value over the std::vector (or any other C++ container) themselves? A rough sketch of the first option is below.
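For what it's worth, one way to keep both call styles is a scalar sigmoid plus a thin std::vector overload built on top of it. This is just a minimal sketch; the names and signatures are illustrative, not what the repo currently has:

```cpp
#include <cmath>
#include <vector>

// Scalar version: users who build their network value by value can call this directly.
inline double sigmoid(double x) {
    return 1.0 / (1.0 + std::exp(-x));
}

// Vector convenience overload: applies the scalar version element-wise.
inline std::vector<double> sigmoid(const std::vector<double>& v) {
    std::vector<double> out;
    out.reserve(v.size());
    for (double x : v) {
        out.push_back(sigmoid(x));
    }
    return out;
}
```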
Added tanh and ReLU activation functions, and their derivatives.
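In case it helps review, this is roughly what tanh/ReLU and their derivatives look like as free functions — a sketch only, not necessarily the exact signatures in this PR:

```cpp
#include <algorithm>
#include <cmath>

// tanh and its derivative: d/dx tanh(x) = 1 - tanh(x)^2
inline double tanh_activation(double x) { return std::tanh(x); }
inline double tanh_derivative(double x) {
    double t = std::tanh(x);
    return 1.0 - t * t;
}

// ReLU and its derivative (the derivative at x == 0 is taken as 0 by convention here)
inline double relu(double x) { return std::max(0.0, x); }
inline double relu_derivative(double x) { return x > 0.0 ? 1.0 : 0.0; }
```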
I think we could build an MNIST classifier with the functionality the repo (C++) has right now. I think it's best not to add classes and the like, and to write the classifier directly without trying to templatize the code.
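To make the "no classes, no templates" idea concrete, a plain-function forward pass for one dense layer could look something like the sketch below. This assumes weights stored as a std::vector of rows; none of these names come from the repo:

```cpp
#include <cstddef>
#include <vector>

// One fully connected layer: out = activation(W * in + b), using plain std::vector only.
std::vector<double> dense_forward(const std::vector<std::vector<double>>& weights,
                                  const std::vector<double>& bias,
                                  const std::vector<double>& input,
                                  double (*activation)(double)) {
    std::vector<double> out(bias);
    for (std::size_t i = 0; i < weights.size(); ++i) {
        for (std::size_t j = 0; j < input.size(); ++j) {
            out[i] += weights[i][j] * input[j];
        }
        out[i] = activation(out[i]);
    }
    return out;
}

// An MNIST-style classifier would then just chain a few of these calls, e.g.:
//   hidden = dense_forward(W1, b1, pixels, relu);
//   output = dense_forward(W2, b2, hidden, sigmoid);
```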