htm-community / comportex

Hierarchical Temporal Memory in Clojure

select cells consistently, esp. when beginning sequences #34

Closed. floybix closed this issue 8 years ago.

floybix commented 8 years ago

When beginning a sequence (or after a sequence reset/break), there is no distal input, so no basis for choosing a winner/learning cell in each column. Cells are then chosen at random.

That random selection is a problem: when the same sequence is presented several times (in isolation), each presentation begins on different cells, and consequently does not reinforce the previous learning but instead spreads partial learning across several cells. This can be seen in the repeated-sequence demos, where the whole sequence is learned but it keeps bursting.

Proposal - I think it would be better to start on the same cell consistently. The first cell.

Perhaps more generally, the choice of winner/learning cell (when there are no predictive cells in a column) should not be completely random but should be a deterministic function of the set of previously-active cells. And it should be a robust function, so that similar activity consistently selects the same cells.

Proposal - take each distal input bit (mod depth) as a candidate cell number, and select the mode of those candidates. Offset it by the current column number (mod depth again), otherwise all cells would be synchronised and we would lose combinatorial capacity (see #31). A rough sketch is below.
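Something like the following, as a minimal sketch only; the function names and the `depth` / `distal-bits` arguments are placeholders here, not Comportex's actual API:

```clojure
;; Hypothetical sketch, not Comportex's API: choose a winner cell
;; deterministically from the previously-active distal input bits.
(defn mode
  "Most frequent value in coll, or nil if coll is empty (ties broken arbitrarily)."
  [coll]
  (when (seq coll)
    (key (apply max-key val (frequencies coll)))))

(defn consistent-winner-cell
  "Map each distal input bit to a candidate cell index (mod depth), take the
   mode, then offset by the column index (mod depth again) so that columns
   are not all synchronised onto the same cell index."
  [col depth distal-bits]
  (let [base (or (mode (map #(mod % depth) distal-bits))
                 0)] ; no distal input (e.g. after a reset): fall back to cell 0
    (mod (+ base col) depth)))
```

For example, with depth 16, `(consistent-winner-cell 7 16 #{3 19 35})` maps all three bits to candidate 3 and returns cell 10 for column 7.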

Needs testing.

floybix commented 8 years ago

(just to clear this up)... groan. This is embarrassing. You are right, @subutai, and I find it disturbing that I had confused myself like that. After investigating why I saw that result from my test, I found a bug in Comportex where it was reinforcing only against learning cells, instead of all active cells (fixed in 3e6096b).
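To illustrate the shape of the fix (a conceptual sketch only, not the actual Comportex code): reinforcement should bump permanence on synapses to all previously-active cells, not only the previously chosen learning cells.

```clojure
;; Conceptual sketch, not Comportex's implementation. Synapses are a map of
;; presynaptic cell id -> permanence. The buggy version passed only the
;; previous learning cells here; the fix passes all previously-active cells.
(defn reinforce-synapses
  "Increment permanence on every synapse whose presynaptic cell was active."
  [synapses prev-active-cells increment]
  (reduce (fn [syns cell-id]
            (if (contains? syns cell-id)
              (update syns cell-id + increment)
              syns))
          synapses
          prev-active-cells))

;; e.g. (reinforce-synapses {[10 2] 0.30, [11 0] 0.50} #{[10 2] [12 1]} 0.05)
;; => {[10 2] 0.35, [11 0] 0.50}
```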

Anyway, it is still worthwhile for me to start on consistent cells after a reset, if only to display those states consistently on my Cell SDRs diagram.

cogmission commented 8 years ago

@floybix I still don't understand how the previously trained "e" cell (the presynaptic cell for the previously trained "h" cell) gets found and has its segments reinforced. What am I missing? It appears to me that a new presynaptic cell-to-segment relationship will be formed every time the same sequence is entered.

floybix commented 8 years ago

@cogmission Let's whiteboard it at the meetup.

subutai commented 8 years ago

After investigating why I saw that result from my test, I found a bug in Comportex where it was reinforcing only against learning cells, instead of all active cells

@floybix No worries. This stuff is extremely tricky and very easy to miss. Even with bugs like this, the overall system often still works OK, which makes them pretty hard to debug. Believe me, we've had our share of bugs like this too. See you at the meetup!