farizrahman4u / recurrentshop

Framework for building complex recurrent neural networks with Keras

What is the principle of readout and teacher forcing? #75


slkingxr commented 7 years ago

Thanks for the great work! It really helps me a lot. But there is still one point that I can't figure out: what is the principle of readout and teacher forcing? How do we feed the output (or the ground truth) of the RNN from the previous time step back into the current time step: by using the output as features together with the input of this step, or by using the output as this step's cell state? I have read the code, but it still confuses me. o(╯□╰)o Hoping someone can answer this for me.

farizrahman4u commented 7 years ago

see docs/teacher_force.md and docs/readout.md
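
A minimal NumPy sketch of the principle those docs describe (illustrative only, not recurrentshop's API; the weight shapes and the concatenation wiring here are assumptions): readout feeds the previous step's output back in as extra features alongside the current input, and teacher forcing simply swaps that fed-back prediction for the ground truth during training.

```python
import numpy as np

def rnn_with_readout(x_seq, y_true_seq=None, teacher_force=False):
    """Toy unrolled RNN showing readout and teacher forcing (not recurrentshop code)."""
    in_dim, out_dim, state_dim = len(x_seq[0]), 4, 8
    rng = np.random.default_rng(0)
    W = rng.normal(size=(in_dim + out_dim + state_dim, state_dim))  # recurrence weights
    V = rng.normal(size=(state_dim, out_dim))                       # output projection

    h = np.zeros(state_dim)      # hidden state
    y_prev = np.zeros(out_dim)   # readout: the previous step's output
    outputs = []
    for t, x in enumerate(x_seq):
        # Readout: y_prev enters as extra features next to x_t;
        # it is concatenated with the input, NOT used as the cell state.
        z = np.concatenate([x, y_prev, h])
        h = np.tanh(z @ W)
        y = h @ V
        outputs.append(y)
        # Teacher forcing: during training, feed back the ground truth
        # instead of the model's own (possibly wrong) prediction.
        y_prev = y_true_seq[t] if (teacher_force and y_true_seq is not None) else y
    return outputs

xs = [np.ones(3) for _ in range(5)]
ys = [np.zeros(4) for _ in range(5)]
train_preds = rnn_with_readout(xs, ys, teacher_force=True)  # training-style
infer_preds = rnn_with_readout(xs)                          # inference-style
```

So the two modes differ only in what gets fed back: the model's own prediction at inference time, the ground truth under teacher forcing.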

Derek-Gong commented 7 years ago

@farizrahman4u In docs/readout.md: `for cell in cells: lstms_output, h, c = cell([lstms_output, h, c])`, which means h and c are passed to the next layer. But isn't c an internal state? Why would it be passed to another cell? Shouldn't there be two lists containing h and c for each layer, so that each cell, as a function, receives its own state?
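
For readers hitting the same question, here is a sketch contrasting the two wirings being discussed (plain Python, not recurrentshop code; the `cell` signature mapping `[features, h, c]` to `(output, new_h, new_c)` is an assumption):

```python
# Wiring as written in docs/readout.md: a single (h, c) pair threads
# through every cell, so layer i receives the states produced by layer i-1.
def shared_state_stack(cells, lstms_output, h, c):
    for cell in cells:
        lstms_output, h, c = cell([lstms_output, h, c])
    return lstms_output, h, c

# The alternative the comment above describes: each layer keeps its own
# (h, c), and only the output travels up the stack.
def per_layer_state_stack(cells, lstms_output, h_list, c_list):
    for i, cell in enumerate(cells):
        lstms_output, h_list[i], c_list[i] = cell([lstms_output, h_list[i], c_list[i]])
    return lstms_output, h_list, c_list
```

The snippet quoted from docs/readout.md matches the first function; the second shows the per-layer bookkeeping the comment is asking about.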