Cerenaut / pt-aha

AHA implemented in PyTorch. New features are being implemented here (not in the TF version).
Apache License 2.0

CA3-CA1 parameters #16

Closed anhhuyalex closed 2 years ago

anhhuyalex commented 3 years ago

Hi, I'm curious why the CA3-CA1 mapping is built with the self.config['ca1'] parameters but takes its learning rate and weight decay from self.config['ca3_ca1']. I can't find self.config['ca3_ca1'] being used anywhere else. https://github.com/Cerenaut/pt-aha/blob/main/cls_module/cls_module/memory/stm/aha/msp.py

```python
# Build the CA1 sub-module, to reproduce the EC inputs
self.ca3_ca1 = SimpleAutoencoder(ca3_shape, self.config['ca1'], output_shape=ca1_output_shape)
self.ca3_ca1_optimizer = optim.Adam(self.ca3_ca1.parameters(),
                                    lr=self.config['ca3_ca1']['learning_rate'],
                                    weight_decay=self.config['ca3_ca1']['weight_decay'])
```
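The inconsistency is that the sub-module is constructed from one config section while its optimizer reads hyperparameters from another. A minimal stand-alone sketch of the likely fix (an assumption; the actual patch lives in the repo) is to draw both from the same key. `StubAutoencoder` and `StubAdam` below are stand-ins for `SimpleAutoencoder` and `optim.Adam`, and the shapes and values are illustrative only:

```python
# Stand-in for SimpleAutoencoder: records which config section it was built from.
class StubAutoencoder:
    def __init__(self, input_shape, config, output_shape=None):
        self.config = config

    def parameters(self):
        return []

# Stand-in for optim.Adam: records the hyperparameters it was given.
class StubAdam:
    def __init__(self, params, lr, weight_decay):
        self.lr = lr
        self.weight_decay = weight_decay

# Illustrative config: a single 'ca3_ca1' section holds everything the
# CA3-CA1 mapping needs, so module and optimizer cannot drift apart.
config = {
    'ca3_ca1': {'learning_rate': 1e-3, 'weight_decay': 4e-6},
}

# Both the sub-module and its optimizer read from the same section.
ca3_ca1 = StubAutoencoder(input_shape=None, config=config['ca3_ca1'])
ca3_ca1_optimizer = StubAdam(ca3_ca1.parameters(),
                             lr=config['ca3_ca1']['learning_rate'],
                             weight_decay=config['ca3_ca1']['weight_decay'])
```

With this layout, the optimizer's hyperparameters always match the config section the module was built from, which is the consistency the issue is asking about.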
abdel commented 2 years ago

This has been fixed, thanks for flagging it!