keon / seq2seq

Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch

No teacher forcing during evaluation #9

Closed · pskrunner14 closed 6 years ago

pskrunner14 commented 6 years ago

Added an is_eval flag to the seq2seq forward function so the model doesn't use teacher forcing during evaluation. Fixes #8
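For context, a minimal sketch of the idea (the `is_eval` name matches the PR description, but the rest of the signature, shapes, and encoder/decoder interface are assumptions, not the repo's exact code): the flag simply bypasses the teacher-forcing coin flip inside the decoding loop.

```python
import random
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Sketch of an encoder-decoder with optional teacher forcing."""
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder

    def forward(self, src, trg, teacher_forcing_ratio=0.5, is_eval=False):
        max_len, batch_size = trg.size(0), trg.size(1)
        vocab_size = self.decoder.output_size
        outputs = torch.zeros(max_len, batch_size, vocab_size, device=src.device)

        encoder_outputs, hidden = self.encoder(src)
        output = trg[0]                      # <sos> tokens
        for t in range(1, max_len):
            output, hidden = self.decoder(output, hidden, encoder_outputs)
            outputs[t] = output
            # is_eval switches teacher forcing off regardless of the ratio
            teacher_force = (not is_eval) and random.random() < teacher_forcing_ratio
            top1 = output.argmax(1)          # model's own prediction
            output = trg[t] if teacher_force else top1
        return outputs
```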

keon commented 6 years ago

Why not simply pass teacher_forcing_ratio=0 during evaluation? I am not a fan of adding more variables that do the same thing (is_eval, teacher_forcing_ratio).
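For reference, a sketch of how that would look in an evaluation loop (the evaluate signature and batch fields here are assumptions, not the repo's exact code): with a ratio of 0 the decoder always feeds back its own predictions, so no extra flag is needed.

```python
import torch

def evaluate(model, val_iter, criterion):
    model.eval()
    total_loss = 0.0
    with torch.no_grad():
        for batch in val_iter:
            src, trg = batch.src, batch.trg
            # ratio of 0 means teacher forcing is never applied
            output = model(src, trg, teacher_forcing_ratio=0.0)
            loss = criterion(output[1:].reshape(-1, output.size(-1)),
                             trg[1:].reshape(-1))
            total_loss += loss.item()
    return total_loss / len(val_iter)
```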

pskrunner14 commented 6 years ago

Yeah, you're right, it is kinda redundant overhead on the model. I thought there should be an explicit flag for evaluation mode. On it.

keon commented 6 years ago

@pskrunner14 thanks :)