tsoding / nn.h

Simple stb-style header-only library for Neural Networks
MIT License
336 stars 34 forks

Autograd engine #11

Open AkashKarnatak opened 1 year ago

AkashKarnatak commented 1 year ago

Right now the backpropagation formulas are hardcoded, but as the library grows with the addition of new layers, cost functions, and optimizers, it will not be feasible for you to keep track of all the formulas. One solution is to implement an automatic differentiation engine (autograd engine) which will take care of backpropagation. This is how modern deep learning frameworks like PyTorch handle it.

You can look at micrograd, which is a minimal autograd engine written in about 100 lines of Python code.