tristandeleu / ntm-one-shot

One-shot Learning with Memory-Augmented Neural Networks
MIT License

Persistent memory #15

Open sheetalreddy opened 6 years ago

sheetalreddy commented 6 years ago

Hi, in the paper they have a paragraph about persistent memory and the results they obtained with it. I was wondering how to keep the external memory persistent and not wipe it across episodes when I have a different label space in each episode. Is it even possible to work around that?

tristandeleu commented 6 years ago

Hi! Here I am indeed resetting the memory at the beginning of each episode. To have a persistent memory across episodes, you can turn the initial memory `M_0` into a tensor, and return the final memory state in `memory_augmented_neural_network` (maybe `l_ntm_var[0][-1]`).
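
Roughly, the structure of the training loop would change like this. This is only a minimal NumPy sketch with placeholder names, not the actual Theano code in this repo: `init_memory`, `run_episode` and the shapes stand in for building `M_0` and calling `memory_augmented_neural_network`, just to show the difference between resetting the memory each episode and feeding the final state back in as the next episode's initial memory.

```python
import numpy as np

# Placeholder sizes, not the repo's actual defaults.
MEMORY_SHAPE = (128, 40)   # (number of slots, slot size)
NUM_EPISODES = 5

def init_memory(shape):
    """Blank memory, analogous to resetting M_0 at the start of an episode."""
    return 1e-6 * np.ones(shape, dtype=np.float32)

def run_episode(memory, episode_id):
    """Stand-in for one episode of training.

    In the real code this would be the forward/backward pass through
    memory_augmented_neural_network; here it just writes a dummy value
    into one slot so the example runs and the carried-over state is visible.
    """
    memory = memory.copy()
    memory[episode_id % memory.shape[0]] += 1.0
    return memory

# Current behaviour: memory wiped at the start of every episode.
for episode in range(NUM_EPISODES):
    memory = init_memory(MEMORY_SHAPE)          # fresh M_0 each time
    memory = run_episode(memory, episode)

# Persistent memory: initialise once, feed the final state back in.
memory = init_memory(MEMORY_SHAPE)              # M_0 created a single time
for episode in range(NUM_EPISODES):
    memory = run_episode(memory, episode)       # final state is next episode's M_0

print("Slots written with persistent memory:", int((memory.sum(axis=1) > 1e-3).sum()))
```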

Let me know if you have any trouble making that change.