loudinthecloud / pytorch-ntm

Neural Turing Machines (NTM) - PyTorch Implementation
BSD 3-Clause "New" or "Revised" License

Why does each batch have a memory? #21

Open MrYaoH opened 1 year ago

MrYaoH commented 1 year ago

Dear author: Your NTM code is a nice piece of work, and its structure is concise and easy to follow. But I am a little confused about why each batch has its own memory. Why not use a single memory shared across the batch, like an LSTM but with an expanded memory cell size? Could you help me clear up this confusion? Thank you very much!

wabbajack1 commented 3 months ago

Consider why this makes sense by analogy with your computer: it often has multiple RAM modules, so an NTM can likewise be given more than one memory. More to the point, each sequence in a batch is an independent sample, so it needs its own memory state, just as an LSTM keeps a separate hidden state per batch element; sharing one memory across the batch would let samples overwrite each other's reads and writes. Batching is also much more efficient than processing sequences one at a time.
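For illustration, here is a minimal sketch (not the repository's actual code, and the class and method names are hypothetical) of why the memory carries a batch dimension: each sample gets its own `N x M` memory matrix, and a batched matrix product reads from each sample's memory with no cross-talk.

```python
import torch

class BatchedMemory:
    """Toy NTM-style memory with one memory matrix per batch element."""

    def __init__(self, N, M):
        self.N, self.M = N, M  # N memory rows, each of width M

    def reset(self, batch_size):
        # One independent N x M memory per sequence in the batch,
        # just as an LSTM allocates one hidden state per batch element.
        self.memory = torch.zeros(batch_size, self.N, self.M)

    def read(self, w):
        # w: (batch_size, N) addressing weights over memory rows.
        # bmm reads from each sample's own memory slice only.
        return torch.bmm(w.unsqueeze(1), self.memory).squeeze(1)

mem = BatchedMemory(N=128, M=20)
mem.reset(batch_size=4)
w = torch.softmax(torch.randn(4, 128), dim=1)  # per-sample addressing
r = mem.read(w)
print(r.shape)  # torch.Size([4, 20]): each sample reads its own memory
```

With a single shared memory instead, the four sequences above would all write into the same matrix during training, corrupting one another's state.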