Closed. abdel closed this issue 3 years ago.
Gideon Kowadlo (14 Jul 2021): But it is required
I see; it would have the actual value 'required'. I don't fully understand how that works, but it's clear that it wasn't a bug.
- Moved `is_hebbian_perforant` to the inside of `AHA`, to fix the issue with the `self` call
- Added `LocalOptim`, to facilitate manual weight updates in a PyTorch-y way
- Added `learning_rules.py`, to hold all the relevant 'local learning rules' (e.g. Oja, Hebb): ported the existing Oja implementation (`OjaRule`) and utilised that as the basis for the second learning rule, `PureHebb`, as the 'pure Hebbian learning' rule is described in the Ketz2013 paper
- Wrapped `nn.Linear` as the `LocalConnection` class, to allow us to default to untrainable weights and better control the initialisation
- Implemented `dg_ca3` and `ec_ca3` using `LocalConnection`, and utilised the new `LocalOptim` to control weight updates for each set of connections
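The local learning rules update weights directly, outside of autograd. A minimal NumPy sketch of the two update rules, just to show the maths involved (the names `hebb_update` and `oja_update` are illustrative, not the repo's actual API):

```python
import numpy as np

def hebb_update(w, x, y, lr=0.01):
    """Pure Hebbian rule: dW = lr * y x^T (weights grow with input/output co-activity)."""
    return lr * np.outer(y, x)

def oja_update(w, x, y, lr=0.01):
    """Oja's rule: the Hebbian term plus a y^2-weighted decay that bounds the weight norm."""
    return lr * (np.outer(y, x) - (y ** 2)[:, None] * w)

# Manual weight update on an untrainable linear map (no gradients involved).
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(3, 5))  # out_features x in_features
x = rng.normal(size=5)
y = w @ x                               # forward pass
w = w + oja_update(w, x, y)             # apply the local rule in place of an autograd step
```

A `LocalOptim`-style optimizer would presumably apply deltas like these in its `step()`, rather than gradient-based updates.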
- Used `pc_out` as the 'target' with respect to the corresponding connection input (e.g. `dg` or `ec_in`)
- Added the `loss/dg_ca3`, `loss/ec_ca3` and `loss/pc` metrics
- Put the `PC` buffer and `PR` implementations behind the negation of the `is_hebbian_perforant` flag
- Updated `lake` to ensure it is not 'assuming' that `PR` exists
- Added `aha_config_hebb.json`, with increased study steps and removal of the `PR` metrics
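For the per-connection loss metrics, one plausible shape (the tensor names and the exact error function here are assumptions; the PR text does not spell out the computation) is a scalar error per set of connections, with `pc_out` as the common target:

```python
import numpy as np

def mse(pred, target):
    """Mean squared error between a connection's output and its target."""
    return float(np.mean((pred - target) ** 2))

# Stand-in activations for the two connections and the PC output they are trained towards.
rng = np.random.default_rng(0)
pc_out = rng.normal(size=8)
dg_ca3_out = rng.normal(size=8)
ec_ca3_out = rng.normal(size=8)

# One scalar per set of connections, logged under the metric names from the change list.
metrics = {
    'loss/dg_ca3': mse(dg_ca3_out, pc_out),
    'loss/ec_ca3': mse(ec_ca3_out, pc_out),
}
```

Logging one scalar per connection like this makes it easy to see which pathway is failing to track `pc_out` during training.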