graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers
https://www.reddit.com/r/MachineLearning/comments/amfinl/project_nlptutoral_repository_who_is_studying/
MIT License

A question about transformer #27

Closed · goodnessSZW closed 5 years ago

goodnessSZW commented 5 years ago

Hey buddy, I've read the code of the Transformer. That's cool. Here's something I can't understand about the input to the decoder. It's fine that we use 'S i want a beer' as the decoder_input during training. However, at test time, the decoder input should start with just 'S'; we then take the token predicted from 'S' by the decoder and feed it back as the decoder's next input, instead of using the whole translated sentence as the decoder_input. After all, at test/prediction time the translated sentence can't be used anywhere in the model except for the final comparison. That's my understanding, but I'm not sure whether I'm right or wrong, since I've seen that the parameters of the forward function of class Transformer include 'dec_inputs'. If I'm right, it would be better to create a separate function that predicts the translated sentence step by step. What do you think?
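For illustration, a minimal sketch of such a step-by-step predict function might look like the following. This assumes a model whose forward takes (enc_inputs, dec_inputs) and returns logits of shape [batch, tgt_len, vocab_size]; start_symbol and end_symbol are hypothetical vocabulary indices for 'S' and 'E', so this is not the tutorial's exact interface:

```python
import torch

def greedy_decode(model, enc_inputs, start_symbol, end_symbol, max_len=20):
    # Begin the decoder input with only the start symbol 'S'.
    dec_inputs = torch.full((1, 1), start_symbol, dtype=torch.long)
    for _ in range(max_len):
        # Run the model on everything generated so far.
        logits = model(enc_inputs, dec_inputs)        # [1, cur_len, vocab_size]
        # Greedily pick the most likely token at the last position.
        next_token = logits[:, -1, :].argmax(dim=-1)  # [1]
        # Feed the prediction back in as the next decoder input.
        dec_inputs = torch.cat([dec_inputs, next_token.unsqueeze(1)], dim=1)
        if next_token.item() == end_symbol:           # stop once 'E' is produced
            break
    return dec_inputs
```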

graykode commented 5 years ago

Do you mean the role of the Best-First-Search decoder?

goodnessSZW commented 5 years ago

Yep, just as you coded in Transformer (greedy_decoder). I overlooked it... haha, embarrassing. So, another question: in a real project, is the greedy decoder used for both training and testing, or only for testing?

graykode commented 5 years ago

Please look up the difference between teacher forcing and non-teacher forcing. It'll help you.

During training it doesn't matter whether you decode greedily or not, but using teacher forcing makes the model converge faster.
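To illustrate the contrast, here is a hedged sketch of a teacher-forced training step: the decoder is fed the ground-truth target shifted right, so every position is predicted in one parallel forward pass regardless of what the model would have generated on its own. The model, optimizer, and tensor shapes are assumptions following the same conventions as the sketch above, not the tutorial's exact code:

```python
import torch
import torch.nn as nn

def train_step(model, optimizer, enc_inputs, dec_inputs, targets):
    # Teacher forcing: the decoder input is the ground truth shifted right,
    # e.g. dec_inputs = ['S', 'i', 'want', 'a', 'beer']
    #      targets    = ['i', 'want', 'a', 'beer', 'E']
    criterion = nn.CrossEntropyLoss()
    logits = model(enc_inputs, dec_inputs)  # [batch, tgt_len, vocab_size]
    loss = criterion(logits.view(-1, logits.size(-1)), targets.view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At test time the ground-truth target is unavailable, so you fall back to a step-by-step loop like greedy_decode above, feeding each prediction back in as the next input.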