Cerenaut / pt-aha

AHA implemented in PyTorch. New features are being implemented here (not in the TF version).
Apache License 2.0

Typo in aha "forward_memory"? #12

Closed · chengxuz closed this issue 3 years ago

chengxuz commented 3 years ago

Hi,

Thanks for this interesting model of the hippocampus. I am reading through your code to understand its logic and found this line confusing: https://github.com/Cerenaut/pt-aha/blob/main/cls_module/cls_module/memory/stm/aha.py#L460. I suspect it's a typo and should read `post_pc_cue = post_dg_ca3_out + post_ec_ca3_out` instead of the current `post_pc_cue = post_ec_ca3_out + post_ec_ca3_out`?
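
For concreteness, a minimal before/after sketch of the line in question (the two tensor names are the ones quoted above; everything else about the surrounding `forward_memory` function is assumed):

```python
# Current line (aha.py, around L460) -- appears to add the EC:CA3 output to itself:
post_pc_cue = post_ec_ca3_out + post_ec_ca3_out

# Suggested fix -- combine the DG:CA3 and EC:CA3 pathway outputs instead:
post_pc_cue = post_dg_ca3_out + post_ec_ca3_out
```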

Best, Chengxu

abdel commented 3 years ago

@chengxuz you're right, that is a typo. It should combine the outputs from both pathways (DG:CA3 and EC:CA3), rather than adding EC:CA3 twice 😄
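
For illustration, a minimal sketch of the intended combination (the helper name and tensor shapes below are hypothetical, not the repo's actual `forward_memory` implementation):

```python
import torch

def build_pc_cue(post_dg_ca3_out: torch.Tensor, post_ec_ca3_out: torch.Tensor) -> torch.Tensor:
    """Combine the DG:CA3 and EC:CA3 pathway outputs into the PC (CA3) cue.

    Hypothetical helper for illustration only; the actual logic lives in
    cls_module/cls_module/memory/stm/aha.py (forward_memory).
    """
    return post_dg_ca3_out + post_ec_ca3_out

# Example usage: two same-shaped pathway outputs (shapes chosen arbitrarily).
dg = torch.randn(1, 100)
ec = torch.randn(1, 100)
cue = build_pc_cue(dg, ec)
assert cue.shape == dg.shape
```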

I have now fixed this in https://github.com/Cerenaut/pt-aha/commit/c37eaafe8aae2d35438be0ddade6710b7a18673e