damientseng / Seq2Seq-Chatbot

A theano implementation of the neural conversational model
MIT License

about seq2seq #4

Open DvHuang opened 8 years ago

DvHuang commented 8 years ago

Sorry for my poor English, and thanks again for your code. Could I ask another question about the seq2seq algorithm? As I understand it, seq2seq uses the last word's hidden state to represent the whole sentence (whether the input is fed forward or reversed), and then uses another LSTM to decode from it. But in your code it looks different:

Does that mean you feed every word's hidden state (not only the last word's) into the next LSTM for decoding?
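
To make the two readings of the question concrete, here is a minimal numpy sketch (not taken from this repository, and using a plain tanh RNN cell in place of a real LSTM) showing the difference between keeping only the encoder's last hidden state and keeping every per-word hidden state:

```python
# Illustrative sketch only -- not code from this repository.
import numpy as np

def rnn_step(x, h, W):
    """One simplified recurrent step (a tanh RNN cell standing in for an LSTM)."""
    return np.tanh(W["xh"] @ x + W["hh"] @ h + W["b"])

def encode(sentence, W, hidden_dim=4):
    """Run the encoder over a list of word vectors.

    Returns both the last hidden state (classic seq2seq sentence summary)
    and the full sequence of per-word hidden states.
    """
    h = np.zeros(hidden_dim)
    all_states = []
    for x in sentence:
        h = rnn_step(x, h, W)
        all_states.append(h)
    return h, np.stack(all_states)

rng = np.random.default_rng(0)
dim, hidden = 3, 4
W = {"xh": rng.normal(size=(hidden, dim)),
     "hh": rng.normal(size=(hidden, hidden)),
     "b": np.zeros(hidden)}
sentence = [rng.normal(size=dim) for _ in range(5)]

last_h, every_h = encode(sentence, W)
# Option A: only the last state summarizes the sentence and initializes the decoder.
print(last_h.shape)   # (4,)
# Option B: every per-word state is kept and exposed to the decoder.
print(every_h.shape)  # (5, 4)
```

Option A is the classic seq2seq setup from the neural conversational model paper; option B is what the question is asking about, i.e. whether the decoder here consumes all encoder states rather than just the final one.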