oxford-cs-ml-2015 / practical6

Practical 6: LSTM language models
https://www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/

Why is the LSTM final state's backward message (dloss/dfinalstate) 0? #6

Open · sunshineatnoon opened this issue 8 years ago

sunshineatnoon commented 8 years ago

Thanks for this code; it's very clear. But I don't understand these two lines:

-- LSTM final state's backward message (dloss/dfinalstate) is 0, since it doesn't influence predictions
local dfinalstate_c = initstate_c:clone()
local dfinalstate_h = initstate_c:clone()

Why is the LSTM final state's backward message (dloss/dfinalstate) 0?
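
For context, here is a minimal, self-contained Lua/Torch sketch of how I read those zero tensors being used to seed backpropagation through time. The names rnn_size, T, dlstm_c, dlstm_h and dprediction_t are illustrative only, and the loop just mimics the shape of the backward pass; it is not the repo's actual train.lua code.

-- Minimal sketch of how the zero "final state" messages seed BPTT.
-- Assumes Torch is installed; rnn_size and T are made-up values.
require 'torch'

local rnn_size = 4
local T = 3  -- hypothetical sequence length

-- dloss/dfinalstate is zero: the final cell/hidden state is only carried
-- over as the initial state of the next sequence; it feeds no prediction
-- inside this sequence, so no loss gradient flows into it directly.
local dfinalstate_c = torch.zeros(rnn_size)
local dfinalstate_h = torch.zeros(rnn_size)

local dlstm_c = {[T] = dfinalstate_c:clone()}  -- gradients w.r.t. cell states
local dlstm_h = {[T] = dfinalstate_h:clone()}  -- gradients w.r.t. hidden states

for t = T, 1, -1 do
    -- At each step, the gradient coming from that step's prediction is added
    -- on top of the message arriving from step t+1; at t == T that incoming
    -- message is the zero tensor above.
    local dprediction_t = torch.randn(rnn_size)  -- stand-in for the criterion/softmax backward
    dlstm_h[t]:add(dprediction_t)
    -- (in the actual training code, the LSTM's backward pass at step t then
    --  produces dlstm_c[t-1] and dlstm_h[t-1]; a zero placeholder stands in here)
    if t > 1 then
        dlstm_c[t-1] = torch.zeros(rnn_size)
        dlstm_h[t-1] = torch.zeros(rnn_size)
    end
end

print(dlstm_c[T])  -- still all zeros: no loss gradient ever reaches the final cell state

If I read the comment correctly, the point is that the final state only matters as the carried-over initial state for the next sequence, so within the current sequence its backward message starts at zero. Is that the intended reasoning?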