Closed: nabihach closed this issue 8 years ago
Hi @nabihach, as you can see in the "Sequence to Sequence Learning with Neural Networks" paper, the authors recommend reversing the input sequence for better performance.
http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf
Thanks for pointing this out!
Feeding in the reversed sequence introduces short-term dependencies between corresponding source and target words, which makes them easier for the model to learn. From a linguistic point of view this is interesting as well.
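To make the idea concrete, here is a minimal Lua sketch (not the repo's code) of what reversing the source does to word distances:

```lua
-- Minimal sketch of the idea from the paper: reversing the source puts its
-- first words closest to the decoder, shortening the distance between
-- corresponding source/target words.
local function reverse(tokens)
  local out = {}
  for i = #tokens, 1, -1 do out[#out + 1] = tokens[i] end
  return out
end

local source = {"how", "are", "you"}

-- Without reversal the encoder reads: how are you  ("how" is 3 steps from the decoder)
-- With reversal it reads:             you are how  ("how" is adjacent to the decoder)
print(table.concat(source, " "))
print(table.concat(reverse(source), " "))
```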
In https://github.com/macournoyer/neuralconvo/blob/master/dataset.lua, line 213, why is the input sequence being reversed? Suppose we have the following conversation:
A: how are you
B: im fine
If I'm understanding this correctly, the following input-target pair is being added to the training set: {input= you are how ; target= im fine}
According to my understanding, the following pair should be added instead: {input= how are you ; target= im fine}
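For reference, a hypothetical illustration (not dataset.lua itself) of the two pairs being compared, written as plain Lua tables of whitespace-split tokens:

```lua
-- Pair that dataset.lua adds, as described above (input reversed):
local reversed_pair = {
  input  = {"you", "are", "how"},
  target = {"im", "fine"},
}

-- Pair the question expected (input in spoken order):
local plain_pair = {
  input  = {"how", "are", "you"},
  target = {"im", "fine"},
}
```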