MaximumEntropy / Seq2Seq-PyTorch

Sequence to Sequence Models with PyTorch
Do What The F*ck You Want To Public License

teacher forcing #18

Open zhao1402072392 opened 4 years ago

zhao1402072392 commented 4 years ago

Firstly, thanks for your code, it's really helpful to me. But could I ask where the teacher forcing part is? Thanks again ^_^

RunxinXu commented 4 years ago

@zhao1402072392 The teacher forcing ratio may be used during training. If you have a look at model.py, you can see that the ratio is effectively 1. If you want to change it, modify the decoding procedure during training. That's my opinion.
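To make the ratio idea concrete, here is a minimal, self-contained sketch of a decoding loop with a teacher forcing ratio. The `decode_sequence` helper and the toy one-step decoder are hypothetical stand-ins (they are not from this repo's model.py); with a ratio of 1 the loop always feeds the ground-truth token back in, which is what a fixed ratio of 1 during training amounts to:

```python
import random

def decode_sequence(decoder_step, target_tokens, start_token, teacher_forcing_ratio):
    """Run a decoding loop, feeding either the ground-truth token
    (teacher forcing) or the model's own prediction at each step."""
    outputs = []
    input_token = start_token
    for gold in target_tokens:
        predicted = decoder_step(input_token)  # hypothetical one-step decoder
        outputs.append(predicted)
        # With ratio=1.0 this is always True, so the gold token is always fed back.
        use_teacher_forcing = random.random() < teacher_forcing_ratio
        input_token = gold if use_teacher_forcing else predicted
    return outputs

# Toy decoder: echoes input + 1 (stands in for a real RNN/attention step).
toy_step = lambda tok: tok + 1

gold = [10, 20, 30]
# Ratio 1.0: every input after the first is the previous gold token.
print(decode_sequence(toy_step, gold, start_token=0, teacher_forcing_ratio=1.0))
# -> [1, 11, 21]
# Ratio 0.0: the model only ever sees its own predictions.
print(decode_sequence(toy_step, gold, start_token=0, teacher_forcing_ratio=0.0))
# -> [1, 2, 3]
```

Lowering the ratio below 1 mixes in the model's own predictions during training (scheduled sampling style), which can reduce exposure bias at test time.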

manzar96 commented 4 years ago

> @zhao1402072392 The teacher forcing ratio may be used during training. If you have a look at model.py, you can see that the ratio is effectively 1. If you want to change it, modify the decoding procedure during training. That's my opinion.

Excuse me, but I have one question. Teacher forcing, as you said, may be used during training (or validation?). However, at test time no teacher forcing should be used. Am I right? So in this implementation there is no use of teacher forcing at test time(?). Correct me if I am wrong. Thanks in advance for your time.