graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers
https://www.reddit.com/r/MachineLearning/comments/amfinl/project_nlptutoral_repository_who_is_studying/
MIT License
14.03k stars 3.9k forks

A question about the decoder in Seq2Seq-Torch #36

Open acm5656 opened 5 years ago

acm5656 commented 5 years ago

https://github.com/graykode/nlp-tutorial/blob/6e171b914ed28eb04dc9176916a99e9a996a7951/4-1.Seq2Seq/Seq2Seq-Torch.py#L92 Hi, I'm an NLP rookie and I want to ask you a question. I read the seq2seq paper, which feeds the output at step t-1 as the decoder input at step t. In this line, your code uses 'SPPPPP' as the decoder input instead. Does this hurt the result? If you see this issue, please answer when you have free time. Although my English is poor, I still want to express my gratitude to you.
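For comparison, here is a minimal sketch (names, sizes, and the greedy loop are my own, not from the tutorial) of what the paper's behavior looks like at inference time: the decoder's own prediction at step t-1 becomes its input at step t, instead of a fixed 'S' + padding sequence.

```python
import torch
import torch.nn as nn

# Hypothetical toy sizes; the tutorial's real model uses its own vocab/dims.
n_class, n_hidden = 8, 16
torch.manual_seed(0)

embedding = nn.Embedding(n_class, n_hidden)
decoder = nn.RNN(input_size=n_hidden, hidden_size=n_hidden)
out_proj = nn.Linear(n_hidden, n_class)

def greedy_decode(enc_hidden, start_token=0, max_len=5):
    """Feed the decoder's own t-1 prediction back in as the input at step t."""
    hidden = enc_hidden                       # [1, batch=1, n_hidden], encoder summary
    token = torch.tensor([start_token])       # begin with the start symbol
    outputs = []
    for _ in range(max_len):
        inp = embedding(token).unsqueeze(0)   # [seq=1, batch=1, n_hidden]
        out, hidden = decoder(inp, hidden)
        token = out_proj(out.squeeze(0)).argmax(dim=-1)  # greedy pick of next token
        outputs.append(token.item())
    return outputs

enc_hidden = torch.zeros(1, 1, n_hidden)
print(greedy_decode(enc_hidden))  # token ids depend on the random init
```

During training, feeding the ground-truth target sequence (shifted by one) instead of the model's own predictions is the standard "teacher forcing" trick; the tutorial's fixed 'S' + padding input is a simplification of that.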

Angry-Echo commented 1 year ago

Hi, after several years you must understand the code by now, so I want to ask you a question. I think the code differs from the paper "Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation". In that paper, the encoder's summary is fed into every cell of the decoder, for both the hidden state and the output, but I don't see that in this code. Is that right?
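For reference, a minimal sketch of the mechanism the paper describes (all names and sizes here are my own illustration, not the tutorial's code): the context vector c from the encoder both initializes the decoder state and is concatenated to the input at every time step.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the Cho et al. style decoder: the encoder summary c
# is re-injected at every step, rather than only seeding the initial state.
n_class, n_hidden = 8, 16
torch.manual_seed(0)

embedding = nn.Embedding(n_class, n_hidden)
# input at each step = [embedded token ; context c] -> size 2 * n_hidden
decoder_cell = nn.RNNCell(input_size=2 * n_hidden, hidden_size=n_hidden)
out_proj = nn.Linear(n_hidden, n_class)

def decode_with_context(c, tokens):
    """Run the decoder, concatenating the context c to the input at every step."""
    hidden = c.clone()                        # also initialize the state with c
    logits = []
    for tok in tokens:
        emb = embedding(tok)                  # [batch, n_hidden]
        step_in = torch.cat([emb, c], dim=-1) # context available at every step
        hidden = decoder_cell(step_in, hidden)
        logits.append(out_proj(hidden))
    return torch.stack(logits)                # [seq_len, batch, n_class]

c = torch.randn(1, n_hidden)
tokens = [torch.tensor([0]), torch.tensor([1]), torch.tensor([2])]
print(decode_with_context(c, tokens).shape)   # torch.Size([3, 1, 8])
```

The tutorial's `nn.RNN` decoder, by contrast, only receives the encoder's final hidden state as its initial state, so the summary influences later steps only through the recurrent hidden state.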