tristandeleu / ntm-one-shot

One-shot Learning with Memory-Augmented Neural Networks
MIT License

zero reset of least used memory #1

Closed by ywatanabex 8 years ago

ywatanabex commented 8 years ago

Thank you for sharing your great code. I have a question.

Between Eq. (7) and (8) in section 3.2, the paper says: "Prior to writing to memory, the least used memory location is computed from w^u_{t−1} and is set to zero." I looked into omniglot.py but I couldn't find the corresponding part. Is there any reason for this?
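For reference, the step being asked about can be sketched as follows. This is a minimal NumPy illustration of the idea (not the repo's actual Theano code, and `zero_least_used` is a hypothetical helper name): find the index of the smallest entry of the usage weights w^u_{t−1} and zero that memory row before the write.

```python
import numpy as np

def zero_least_used(memory, w_u_prev):
    """Hedged sketch: before writing, reset the least-used memory slot.

    memory   : (N, M) memory matrix M_{t-1}
    w_u_prev : (N,) usage weight vector w^u_{t-1}
    """
    least_used = np.argmin(w_u_prev)   # index of the least-used location
    memory = memory.copy()
    memory[least_used] = 0.0           # set that row to zero prior to the write
    return memory

# Toy usage: slot 1 has the smallest usage weight, so its row is zeroed.
memory = np.ones((4, 3))
w_u_prev = np.array([0.5, 0.1, 0.9, 0.3])
new_memory = zero_least_used(memory, w_u_prev)
print(new_memory[1])  # → [0. 0. 0.]
```

In the actual model this reset would happen once per time step, between computing the usage weights and applying the write of Eq. (8).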

tristandeleu commented 8 years ago

Indeed, nice catch! I hadn't noticed that. I'll make the change shortly. Thank you very much for pointing this out!