The main change here is making apical predictions dependent on corresponding lateral predictions. This may or may not be the final solution, but it certainly makes more sense than the current approach.
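A minimal sketch of the gating idea in plain Python/NumPy, not the actual code: a cell's apical support only counts when that same cell also has lateral support, so apical-only cells never become predicted on their own. Names like `predicted_cells`, `lateral_support`, and `apical_support` are illustrative.

```python
import numpy as np

def predicted_cells(lateral_support, apical_support):
    """Return indices of cells to treat as predicted.

    Both arguments are boolean arrays with one entry per cell, True where
    the cell has an active segment of that type.
    """
    lateral = np.asarray(lateral_support, dtype=bool)
    apical = np.asarray(apical_support, dtype=bool)

    # Apical support is only honored where lateral support is also present.
    both = lateral & apical
    if both.any():
        # Cells with both kinds of support win out over lateral-only cells.
        return np.flatnonzero(both)
    # Otherwise fall back to lateral-only predictions; apical-only cells
    # are never predicted by themselves.
    return np.flatnonzero(lateral)

# Cell 2 has only apical support, so it is not predicted; cell 0 has both.
print(predicted_cells([True, False, False, True],
                      [True, False, True,  False]))  # -> [0]
```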
I added an option for temporal pooling to continue even in the face of bursting, in which case pooling would presumably be reset manually.
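Roughly what that option looks like, as a toy sketch (class and parameter names here are hypothetical, not the real interface):

```python
class TemporalPoolerSketch:
    """Toy pooler illustrating the continue-on-burst option."""

    def __init__(self, continue_pooling_on_burst=False):
        self.continue_pooling_on_burst = continue_pooling_on_burst
        self.pooled_cells = set()

    def compute(self, active_cells, bursting):
        # Default behavior: a bursting (unpredicted) input ends the pooled
        # representation. With the option enabled, pooling carries on.
        if bursting and not self.continue_pooling_on_burst:
            self.pooled_cells.clear()
        self.pooled_cells.update(active_cells)
        return set(self.pooled_cells)

    def reset(self):
        # With continue_pooling_on_burst=True the caller is expected to end
        # the pooled representation explicitly, e.g. at sequence boundaries.
        self.pooled_cells.clear()
```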
TP levels are set in proportion to the initial total excitation. This may become irrelevant if we switch to Yuwei/Scott's idea of persistent synapses instead of persistent cells.
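A sketch of the proportional levels (names are illustrative, and the decay step is my assumption about how the per-cell persistence plays out):

```python
import numpy as np

def initial_pooling_levels(total_excitation, max_level=1.0):
    """Scale each cell's pooling level by its share of the peak initial excitation."""
    excitation = np.asarray(total_excitation, dtype=float)
    peak = excitation.max()
    if peak <= 0.0:
        return np.zeros_like(excitation)
    return max_level * excitation / peak

def decay(levels, rate=0.1):
    """Reduce levels each timestep; a cell drops out of the pool at zero."""
    return np.maximum(levels - rate, 0.0)

levels = initial_pooling_levels([4.0, 10.0, 0.0, 6.0])
print(levels)         # [0.4 1.  0.  0.6]
print(decay(levels))  # [0.3 0.9 0.  0.5]
```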
Also added an option for immediate learning of novel sequences, which comes at the cost of reusing cells from other contexts. Very experimental (even more so than everything else).
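A heavily hedged sketch of what that could look like (all names hypothetical): on a bursting column, a lateral segment is grown right away on the least-used cell rather than waiting for the pattern to repeat, so a cell that already encodes another context can get repurposed.

```python
import random

def learn_immediately(column_cells, previous_active_cells, segments):
    """Grow a lateral segment on a bursting column without waiting for a repeat.

    column_cells: cell ids in the bursting column.
    previous_active_cells: cell ids active on the previous timestep.
    segments: dict mapping cell id -> list of segments (each a set of presynaptic cells).
    """
    # Pick the least-used cell; it may already carry segments from other
    # learned contexts, which is the reuse cost mentioned above.
    fewest = min(len(segments.get(c, [])) for c in column_cells)
    candidates = [c for c in column_cells if len(segments.get(c, [])) == fewest]
    winner = random.choice(candidates)

    # Grow a new segment to the previously active cells immediately.
    segments.setdefault(winner, []).append(set(previous_active_cells))
    return winner
```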