shawntan / neural-turing-machines

Attempt at implementing the system described in "Neural Turing Machines" by Alex Graves, Greg Wayne, and Ivo Danihelka (http://arxiv.org/abs/1410.5401)
https://blog.wtf.sg/category/neural-turing-machines/

Why is the write operation decomposed into two parts, "an erase followed by an add"? #15

Open ylqfp opened 8 years ago

ylqfp commented 8 years ago

The paper said: "Taking inspiration from the input and forget gates in LSTM, we decompose each write into two parts: an erase followed by an add". Why? Thanks!

shawntan commented 8 years ago

The LSTM has the same idea of 'forgetting' old content and then adding in the new input. That was what was meant by that line in the paper: the erase step is multiplicative (like the forget gate, it can scale old memory content down to zero), while the add step is additive (like the input gate, it blends in new content). Having both lets the controller either reset a memory location, accumulate into it, or anything in between, and both operations are differentiable.
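To make the analogy concrete, here is a minimal NumPy sketch of the write rule from the paper (equations 3 and 4), not this repo's actual Theano code; the function name and shapes are my own choices for illustration:

```python
import numpy as np

def ntm_write(memory, w, erase, add):
    """One NTM write step: an erase followed by an add.

    memory: (N, M) memory matrix M_{t-1}
    w:      (N,)   write weights over the N memory locations
    erase:  (M,)   erase vector e_t, elements in [0, 1]
    add:    (M,)   add vector a_t
    """
    # Erase: each location is scaled down where both the write weight
    # and the erase vector are high (multiplicative, like the forget gate).
    erased = memory * (1.0 - np.outer(w, erase))
    # Add: new content is blended in (additive, like the input gate).
    return erased + np.outer(w, add)

# With w(i) = 1 and erase = 1, location i is fully overwritten by `add`;
# locations with w(i) = 0 are left untouched.
M = np.ones((3, 4))
w = np.array([0.0, 1.0, 0.0])
out = ntm_write(M, w, erase=np.ones(4), add=np.full(4, 0.5))
```

In this example `out` keeps rows 0 and 2 at their old value of 1, while row 1 is erased to 0 and then written with 0.5.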

Just to be clear: I was in no way involved with the writing of the paper. This repo just happens to be one of the popular implementations of the NTM currently.