This is a simple, intuitive method to solve linear equations using recursive least squares (RLS).
Check out the step-by-step video tutorial here: https://youtu.be/4vGaN1dTVhw
Algorithmically, this method requires fewer operations than matrix inversion. In practice, however, it is hard to compare it fairly with established linear solvers, because matrix operations benefit from many optimization tricks at the hardware level. We added a simple C++ implementation using the Eigen library to compare the performance of this method against matrix inversion.
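To make the idea concrete, here is a minimal NumPy sketch of solving A x = b with a single RLS pass over the rows of A. The function name `rls_solve` and the prior scale `delta` are illustrative choices, not names from this repo; `delta` sets the initial inverse covariance P = delta * I, and larger values give a weaker prior (a closer approximation to the exact solution).

```python
import numpy as np

def rls_solve(A, b, delta=1e6):
    """Approximately solve A x = b with one recursive-least-squares pass.

    Processes one row of A at a time; no matrix inversion is performed.
    """
    n = A.shape[1]
    x = np.zeros(n)
    P = delta * np.eye(n)            # inverse covariance estimate
    for a, y in zip(A, b):
        Pa = P @ a
        k = Pa / (1.0 + a @ Pa)      # gain vector
        x = x + k * (y - a @ x)      # correct the estimate with the residual
        P = P - np.outer(k, Pa)      # rank-1 downdate of P
    return x

# Example: a small 3x3 system whose exact solution is [1, 1, 1]
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([3.0, 5.0, 3.0])
print(rls_solve(A, b))               # close to np.linalg.solve(A, b)
```

Each update costs O(n^2), so one pass over n rows is O(n^3) overall, the same order as standard solvers, but with fewer operations per row than forming and inverting the full matrix.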
Inspired by the following post by whuber: https://stats.stackexchange.com/q/166718
There is an example at the end of RLS_Neural_Network.py which shows how this network can learn the XOR data in a single pass over the training set. Run the script to see the output.
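As an illustration of why a single pass suffices (this is a standalone sketch, not the actual architecture in RLS_Neural_Network.py), a network with a fixed random hidden layer whose output weights are fit by one RLS pass can interpolate XOR; the hidden width `H` and seed are arbitrary choices here:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Fixed random hidden layer with tanh activation
H = 8
W = rng.normal(size=(2, H))
bias = rng.normal(size=H)
Phi = np.tanh(X @ W + bias)          # 4 x H hidden features

# One RLS pass over the four samples to fit the output weights
delta = 1e6                          # weak prior: P starts as delta * I
w = np.zeros(H)
P = delta * np.eye(H)
for phi, t in zip(Phi, y):
    Pp = P @ phi
    k = Pp / (1.0 + phi @ Pp)        # gain vector
    w = w + k * (t - phi @ w)        # correct weights with the residual
    P = P - np.outer(k, Pp)          # rank-1 downdate of P

print(np.round(Phi @ w, 3))          # should be close to [0, 1, 1, 0]
```

Because fitting the output layer is a linear least-squares problem in the hidden features, RLS reaches the solution after seeing each sample once, with no learning rate and no repeated epochs.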
Advantages of using RLS for learning instead of gradient descent:
Disadvantages: