macournoyer / neuralconvo

Neural conversational model in Torch

reversing the input sequence #54

Closed · nabihach closed 8 years ago

nabihach commented 8 years ago

In https://github.com/macournoyer/neuralconvo/blob/master/dataset.lua, line 213, why is the input sequence being reversed? Suppose we have the following conversation:

A: how are you
B: im fine

If I'm understanding this correctly, the following input-target pair is being added to the training set: {input= you are how ; target= im fine}

In my understanding, this pair should be added instead: {input= how are you ; target= im fine}
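For reference, here is a minimal sketch of the transformation in question (the helper name is hypothetical; the actual code is on line 213 of dataset.lua):

```lua
-- Reverse a tokenized input sequence before pairing it with its target.
-- Hypothetical helper, not the exact code from dataset.lua.
local function reverseSequence(tokens)
  local reversed = {}
  for i = #tokens, 1, -1 do
    table.insert(reversed, tokens[i])
  end
  return reversed
end

local input  = reverseSequence({"how", "are", "you"})  -- {"you", "are", "how"}
local target = {"im", "fine"}                          -- the target is NOT reversed
```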

maeda commented 8 years ago

Hi @nabihach, as explained in the "Sequence to Sequence Learning with Neural Networks" paper, the authors recommend reversing the input sequence for better performance:

http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf

nabihach commented 8 years ago

Thanks for pointing this out!

Jordy-VL commented 8 years ago

Feeding in the reversed sequence makes short-term dependencies easier to learn: the first words of the source end up closest to the first words of the target, so the LSTM no longer has to bridge the whole sentence length before it starts decoding; see the toy illustration below. From a linguistic point of view this is interesting as well.
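As a toy illustration of that point (an assumed example, not code from this repo), reversal shortens the distance between the first source word and the first target word:

```lua
-- Distance from the source token at position i to the first decoder
-- step, assuming the decoder starts right after the last encoder step.
local function lag(seq, i) return #seq - i + 1 end

local source   = {"how", "are", "you"}
local reversed = {"you", "are", "how"}

print(lag(source, 1))    -- "how" unreversed: 3 steps from "im"
print(lag(reversed, 3))  -- "how" reversed: 1 step from "im"
```

As the paper notes, the average distance between corresponding source and target words is unchanged by reversal, but the short lags at the start of the sentence give the LSTM dependencies it can learn early.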