microsoft / MASS

MASS: Masked Sequence to Sequence Pre-training for Language Generation
https://arxiv.org/pdf/1905.02450.pdf

Question towards the Pre-trained weight for the Neural Machine Translation under supNMT #168

Open · MichaelCaohn opened 3 years ago

MichaelCaohn commented 3 years ago

Thank you so much for the great work and for making it public.

I have read the README instructions for Neural Machine Translation under supNMT. Regarding the En-Zh pretrained model weight: is it a weight specific to the En-Zh translation task, or is it a more general weight that can be fine-tuned for other translation tasks, such as En-Fr?