harvardnlp / seq2seq-attn

Sequence-to-sequence model with LSTM encoder/decoders and attention
http://nlp.seas.harvard.edu/code
MIT License

Training Seq2Seq models for modelling conversations #39

Closed vikram-gupta closed 8 years ago

vikram-gupta commented 8 years ago

Hello,

This is not an issue but a request for guidance. Has anyone tried using this great project to train seq2seq models for generating conversations, as in [https://arxiv.org/pdf/1506.05869.pdf]?

Thanks
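
One practical step for this use case is converting a flat conversation transcript into the parallel source/target line files that seq2seq-attn's preprocessing expects (one source context per line, one response per line). Below is a minimal sketch of that pairing step; the function name, file names, and the single-turn context window are illustrative assumptions, not part of seq2seq-attn itself.

```python
# Hedged sketch: build (context, response) pairs from an ordered list of
# conversation turns, suitable for writing out as parallel src/targ files.
# `context_size` controls how many previous turns form the source side.

def make_pairs(turns, context_size=1):
    """Pair each utterance with its preceding context.

    turns: list of utterance strings in conversation order.
    Returns (sources, targets), where sources[i] is the space-joined
    context window and targets[i] is the reply that follows it.
    """
    sources, targets = [], []
    for i in range(context_size, len(turns)):
        sources.append(" ".join(turns[i - context_size:i]))
        targets.append(turns[i])
    return sources, targets


if __name__ == "__main__":
    turns = ["hi", "hello , how are you ?", "i am fine .", "good ."]
    src, targ = make_pairs(turns)
    # Write one example per line (illustrative file names).
    with open("src-train.txt", "w") as f:
        f.write("\n".join(src) + "\n")
    with open("targ-train.txt", "w") as f:
        f.write("\n".join(targ) + "\n")
```

With a larger `context_size`, each source line carries more dialogue history, which is one common way to give the model more to condition on than the immediately preceding turn.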

yoonkim commented 8 years ago

i've experimented a little bit with dialogue, and i observe similar results to those of non-attentional seq2seq models--responses tend to be quite generic ("i don't know." "yes." "no." etc.)

vikram-gupta commented 8 years ago

Thanks @yoonkim