VityaVitalich closed this issue 5 days ago
Hi @VityaVitalich
Thanks for your interest in our project!
The `teacher_forcing_seq` argument is used when we have a given sequence for calculating the lookback ratio. That given sequence may not be the one that greedy decoding would produce.
We don't use teacher forcing in the main experiments. We only use it in the experiment in Section 4, where we need to ensure that the 7B and 13B models generate the same content while we extract lookback ratio features. So we decode with the 7B model to get its outputs, then run the 13B model with the 7B outputs as `teacher_forcing_seq`.
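To make the idea concrete, here is a toy sketch of the difference between greedy decoding and teacher-forced decoding (this is illustrative pseudologic, not the repo's actual code; `toy_model_step` and the feature tuples are made-up stand-ins for the real model and the lookback ratio features):

```python
def toy_model_step(prefix):
    """Stand-in model: 'predicts' the next token as (last token + 1) % 5."""
    return (prefix[-1] + 1) % 5

def decode(prompt, steps, forced_seq=None):
    """Greedy decoding, optionally teacher-forced with a given sequence."""
    seq = list(prompt)
    features = []
    for t in range(steps):
        pred = toy_model_step(seq)        # model's own greedy prediction
        features.append((len(seq), pred))  # stand-in for per-step features
        if forced_seq is not None:
            seq.append(forced_seq[t])      # teacher forcing: feed the given token
        else:
            seq.append(pred)               # normal greedy decoding
    return seq, features

greedy_out, _ = decode([0], steps=3)                        # -> [0, 1, 2, 3]
forced_out, _ = decode([0], steps=3, forced_seq=[4, 4, 4])  # -> [0, 4, 4, 4]
```

The point is that with `forced_seq` set, the features are computed over exactly the tokens of the given sequence (e.g. the 7B model's outputs), regardless of what the model itself would have generated.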
Let me know if you have more questions!
Dear authors,
Thank you for your work, I really liked it and started delving into the code. However, it is unclear to me where `teacher_forcing_seq` is used and why.
I can clearly see that it is an argument for generation in the vanilla case and that it is loaded only when the argument is passed, but you do not pass this argument in the README, which is a bit confusing.
It is also a bit unclear what it does, and whether I really need to understand it :)
Thank you for your help!