teslacool / SCA

Soft Contextual Data Augmentation

Why do you want to train two language models? #18

Closed: genbei closed this issue 4 years ago

genbei commented 4 years ago

Don't you use p(x) in place of x? From that I assumed only the source-language language model is trained, so what is the target-side language model used for? Please answer, thank you very much.
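For context, a minimal sketch of what "use p(x) instead of x" means in soft contextual data augmentation: a word's embedding is replaced by the expectation of the embedding table under a language model's predicted distribution. The function name, the `gamma` replacement probability, and the assumption that `lm(tokens)` returns per-position vocabulary logits are all hypothetical and not taken from this repo.

```python
import torch

def soft_word_embeddings(tokens, lm, embedding, gamma=0.15):
    """Replace a random subset of token embeddings with the expected
    embedding under the language model's distribution p(x)."""
    with torch.no_grad():
        logits = lm(tokens)                    # (batch, seq, vocab) logits; assumed LM interface
        p = torch.softmax(logits, dim=-1)      # soft distribution p(x) over the vocabulary
    hard = embedding(tokens)                   # ordinary (hard) word embeddings
    soft = p @ embedding.weight                # expected embedding: sum_w p(w) * e(w)
    # with probability gamma, substitute the soft embedding for the hard one
    mask = (torch.rand(tokens.shape, device=tokens.device) < gamma).unsqueeze(-1)
    return torch.where(mask, soft, hard)
```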

genbei commented 4 years ago

I see, the target language model is used when decoding.
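A hypothetical usage sketch of why two language models are involved, assuming the `soft_word_embeddings` helper above and illustrative variable names: the source-side LM would supply soft words for the encoder input, and the target-side LM for the decoder-side input.

```python
# Illustrative only; variable and model names are not from this repo.
src_emb = soft_word_embeddings(src_tokens, src_lm, src_embedding)          # source-side LM
tgt_emb = soft_word_embeddings(prev_target_tokens, tgt_lm, tgt_embedding)  # target-side LM
output = translation_model(src_emb, tgt_emb)                               # model consumes the soft embeddings
```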