Open breznak opened 9 years ago
Sounds interesting... @scottpurdy ?
Sounds interesting. Feel free to experiment. For us to consider it in NuPIC you will need to show that it has some real benefit without breaking existing applications, and that it is biologically plausible. We are working on getting a more descriptive set of requirements for algorithmic changes published by Jeff/Subutai.
Improve the way we can hint forgetting of sequences, or set contextual closeness.
The current means to do so are resetting the stream, or `model.sequenceReset()`.
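To make the limitation concrete, here is a minimal sketch of the current workflow. `FakeModel` is a hypothetical stand-in for a NuPIC model (not the real OPF API); the point is that `sequenceReset()` is all-or-nothing — it discards the entire temporal context, with no graded alternative.

```python
class FakeModel:
    """Hypothetical stand-in for a NuPIC model, for illustration only."""

    def __init__(self):
        self.context = []  # temporal context accumulated so far

    def run(self, record):
        # Feed one record; context grows with each call.
        self.context.append(record)

    def sequenceReset(self):
        # Hard reset: ALL temporal context is discarded at once.
        self.context = []


model = FakeModel()
for word in ["quick", "brown", "fox"]:
    model.run(word)
model.sequenceReset()  # sequence boundary: context is gone entirely
assert model.context == []
```

There is nothing in between "keep everything" and "forget everything", which is what this issue proposes to change.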
Example use-case:
An easy example is the context of words (how a semantic vector is constructed in NLP): a sliding window groups words that are learned together:

A quick [brown fox jumps] over a ...

The [brackets] define the strongest contextual bond (which may or may not be correct, but that is how the method works, using distance). We would like to extend this to sentences, paragraphs, pages, and whole documents. A similar example can be made for ECG or many other data.
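The sliding-window grouping above can be sketched in a few lines (`windows` is an illustrative helper, not part of NuPIC):

```python
def windows(tokens, size=3):
    """Group tokens with a sliding window: each window is one
    set of words that are learned together (strongest bond)."""
    return [tokens[i:i + size] for i in range(len(tokens) - size + 1)]


sent = "A quick brown fox jumps over".split()
groups = windows(sent)
assert ["brown", "fox", "jumps"] in groups
```

Extending this to sentence, paragraph, page, and document scope would mean progressively weaker bonds at each wider level, which is where a graded forgetting hint becomes useful.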
Proposal:
Finer control over forgetting
A call such as `forgetALittle()`, invoked after each [triplet] or at the end of a sequence/paragraph/..., would be enough. Alternatively, `decaySynapsesBy(x)`, but it can be tricky how the decay accumulates over multiple calls.

Usage: task-specific model code calls it at multiple places.
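A rough sketch of what `decaySynapsesBy(x)` could mean (a hypothetical free function over a list of permanence values, not an existing NuPIC API), including the accumulation caveat:

```python
def decay_synapses_by(permanences, x):
    """Hypothetical decaySynapsesBy(x): scale every synapse
    permanence down by a fraction x, clamped at zero."""
    return [max(0.0, p * (1.0 - x)) for p in permanences]


perms = [0.8, 0.5, 0.2]
once = decay_synapses_by(perms, 0.1)
twice = decay_synapses_by(once, 0.1)
# Repeated calls compound multiplicatively: two decays by 0.1
# scale by (1 - 0.1)**2 = 0.81, not by 0.8. This is the
# "tricky accumulation" issue mentioned above -- callers at
# multiple places in task code would stack decays unintentionally.
```

One design answer would be to decay relative to an absolute baseline rather than the current value, but that trades one bookkeeping problem for another.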
CC @subutai @rhyolight