arvoelke / nengolib

Nengo library of additional extensions

Recursive least-squares learning in Nengo #133

Closed arvoelke closed 6 years ago

arvoelke commented 6 years ago

@psipeter @celiasmith

This shows how to implement recursive least-squares (RLS) as a learning rule in Nengo. The equations come from the FORCE paper by Sussillo and Abbott (2009). See the committed notebook for details.
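For readers who haven't seen RLS before, the core update can be sketched in plain NumPy. This is an illustrative `rls_update` helper (a hypothetical name, not the actual nengolib implementation): each step rank-1-updates a running estimate `P` of the inverse activity correlation matrix and nudges the decoders against the error.

```python
import numpy as np

def rls_update(P, w, r, e):
    """One recursive least-squares step (after Sussillo & Abbott, 2009).

    P: (n, n) running estimate of the inverse activity correlation matrix
    w: (n, d) decoders; r: (n,) activities; e: (d,) error = decoded - target
    """
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)   # gain vector
    P -= np.outer(k, Pr)      # rank-1 downdate of P
    w -= np.outer(k, e)       # error-driven decoder update
    return P, w

# Tiny demo: recover a random linear decoding from noiseless samples.
rng = np.random.default_rng(0)
n, d = 10, 2
W_true = rng.normal(size=(n, d))
P = np.eye(n) / 1e-2          # P0 = I / alpha; alpha acts as L2 regularization
w = np.zeros((n, d))
for _ in range(300):
    r = rng.normal(size=n)
    e = w.T @ r - W_true.T @ r
    P, w = rls_update(P, w, r, e)
```

Note that `P` converges to the inverse of the (regularized) activity Gram matrix, which is why the online solution ends up matching Nengo's offline least-squares decoders.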

It appears to work extremely well, even with spiking neurons:

[Attached figures: simulation output, error signal, and gamma matrix]

This is learning a communication channel. The error signal is disabled after one period of the sine wave, and the network gives the correct answer thereafter. The gamma matrix it finds (see above) is essentially identical to the default computed offline by Nengo. This makes sense, since both are doing least-squares optimization, but it is still remarkable considering that we're using spiking neurons, providing only a single oscillation, and learning online.

You can think of this as an alternative to using PES, with the following important differences:

In other words, RLS should consistently outperform PES, but it is not biologically plausible and requires extra computation and memory per timestep. If the online aspect is not required, just stick with Nengo's default (offline) L2 optimization. I think this will be most useful for FORCE-style learning.

arvoelke commented 6 years ago

Note that, from a Nengo user's perspective, this is as simple as substituting nengo.PES(...) with nengolib.RLS(...). They both take the same parameters, use the same sign on the error signal, and have the same overall effect of minimizing that error signal over time!
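To see where the performance difference comes from, it can help to compare the two rules outside of Nengo. Below is a minimal NumPy sketch (the helper names `pes_like_step` and `rls_step` are hypothetical, not nengolib's API): PES reduces to an LMS-style gradient step, Δw = −κ·e·r, whereas RLS additionally projects the error through a running estimate `P` of the inverse activity correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                              # number of activity dimensions ("neurons")
W_true = rng.normal(size=n)         # target decoders we hope to recover

def pes_like_step(w, r, e, kappa=1e-3):
    # LMS / PES-style gradient step: subtract a scaled error-activity product
    return w - kappa * e * r

def rls_step(w, P, r, e):
    # RLS step: project the error through the inverse correlation estimate
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    return w - e * k, P - np.outer(k, Pr)

w_pes = np.zeros(n)
w_rls = np.zeros(n)
P = np.eye(n) / 1e-2                # regularized initial inverse correlation
for _ in range(500):
    r = rng.normal(size=n)
    target = W_true @ r
    w_pes = pes_like_step(w_pes, r, w_pes @ r - target)
    w_rls, P = rls_step(w_rls, P, r, w_rls @ r - target)

err_pes = np.linalg.norm(w_pes - W_true)
err_rls = np.linalg.norm(w_rls - W_true)
```

Both rules consume the same error signal with the same sign; the only structural difference is the matrix scaling the error-times-activity update, which is exactly where RLS's extra compute and memory go.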

celiasmith commented 6 years ago

Great. Yup this is as it should be, like you mentioned :) Good summary of differences/strengths/weaknesses. best, .c


tbekolay commented 6 years ago

Wow, this is super cool! Nice work @arvoelke :+1:

arvoelke commented 6 years ago

TODO:

arvoelke commented 6 years ago

The documentation now includes a side-by-side comparison of PES versus RLS on a scalar spiking communication channel. This won't be visible until the next release (unless you build the docs yourself), so I've copied it below:

[Figure: rls_versus_pes comparison]

This PR also contains a notebook example that shows how to construct both spiking FORCE and full-FORCE networks in Nengo. Again, the rendered version will be visible upon next release (nengolib>0.4.2).

codecov-io commented 6 years ago

Codecov Report

Merging #133 into master will not change coverage. The diff coverage is 100%.


@@          Coverage Diff          @@
##           master   #133   +/-   ##
=====================================
  Coverage     100%   100%           
=====================================
  Files          29     29           
  Lines        1373   1374    +1     
  Branches      157    157           
=====================================
+ Hits         1373   1374    +1
| Impacted Files | Coverage Δ |
|----------------|------------|
| nengolib/temporal.py | 100% <100%> (ø) :arrow_up: |
