jorgenkg / python-neural-network

This is an efficient implementation of a fully connected neural network in NumPy. The network can be trained with a variety of learning algorithms: backpropagation, resilient backpropagation and scaled conjugate gradient learning. The network has been developed with PyPy in mind.
BSD 2-Clause "Simplified" License
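To give a rough, self-contained picture of what the simplest of the learning algorithms listed above (plain backpropagation) does conceptually, here is a minimal NumPy sketch. All class and function names in it are made up for illustration and are not this project's API; it trains a tiny fully connected network with sigmoid activations on XOR using gradient descent on a mean squared error loss.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyNet:
    """Illustrative two-layer fully connected network (not this library's API)."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))

    def forward(self, X):
        self.h = sigmoid(X @ self.W1)       # hidden activations
        self.y = sigmoid(self.h @ self.W2)  # output activations
        return self.y

    def backprop_step(self, X, target, lr=0.5):
        y = self.forward(X)
        # Gradient of mean squared error, propagated back through the sigmoids
        delta_out = (y - target) * y * (1 - y)
        delta_hidden = (delta_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= lr * self.h.T @ delta_out
        self.W1 -= lr * X.T @ delta_hidden
        return np.mean((y - target) ** 2)

# Usage: learn the XOR function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)
net = TinyNet(2, 4, 1)
for epoch in range(5000):
    loss = net.backprop_step(X, t)
print("final loss:", loss)
```

Resilient backpropagation and scaled conjugate gradient use the same gradients but replace the fixed learning-rate update with adaptive step sizes and conjugate search directions, respectively.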

Theoretical background links in documentation #15

Closed: hwsamuel closed this issue 7 years ago

hwsamuel commented 7 years ago

It would be very useful, especially for newcomers to neural networks, to have links pointing to the theoretical background for some of the concepts in the documentation, e.g. each of the activation functions and each of the learning algorithms. Right now the documentation is very well written, but it focuses more on how to use the available functions than on giving a primer on what the concepts mean and the differences between them.

jorgenkg commented 7 years ago

I can see that it would help novice users understand when to apply the various cost and activation functions; however, this would be well beyond the scope of this hobby project.

Most of my theoretical background was hard earned from reading scientific papers, in addition to the books Machine Learning: A Probabilistic Perspective and Artificial Intelligence: A Modern Approach.

Nevertheless, anyone who is willing to invest their time in extending the documentation with theoretical background information is very welcome to create a PR.