microsoft/MASS

MASS: Masked Sequence to Sequence Pre-training for Language Generation
https://arxiv.org/pdf/1905.02450.pdf

Is there a typo in the README (Fine-tuning (CNN / Daily Mail))? #144

Closed: SoyChae closed this issue 4 years ago

SoyChae commented 4 years ago

Hi, first of all, thanks for your hard work.

In the pipeline for Fine-tuning (CNN / Daily Mail) that you provide in the README, the shell script for fine-tuning MASS on summarization uses --task translation_mass and --arch transformer_mass_base.
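For context, the fine-tuning step being discussed looks roughly like the sketch below. The --task and --arch values are the ones quoted from the README; the data path, --user-dir location, and the remaining hyperparameter flags are illustrative placeholders, not the repo's exact script:

```bash
# Sketch of the CNN/DailyMail fine-tuning invocation under discussion.
# --task translation_mass and --arch transformer_mass_base are the flags
# in question; the data path and other settings are placeholders.
fairseq-train cnndm/processed \
    --user-dir mass \
    --task translation_mass --arch transformer_mass_base \
    --optimizer adam --lr 0.0005 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --max-tokens 4096
```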

I think --task should be masked_s2s and --arch should be transformer_mass_base. Could you check this part?

Thanks!

StillKeepTry commented 4 years ago

No, masked_s2s is for pre-training, and translation_mass is for fine-tuning.
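To make the distinction concrete: masked_s2s is the task used in the pre-training step, not fine-tuning. A minimal sketch of that step, assuming the same --user-dir layout as above (the monolingual data path and hyperparameters are placeholders):

```bash
# Sketch of the pre-training step, which is where masked_s2s belongs.
# Data path and hyperparameters are placeholders, not the repo's values.
fairseq-train mono/processed \
    --user-dir mass \
    --task masked_s2s --arch transformer_mass_base \
    --optimizer adam --lr 0.0005 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --max-tokens 4096
```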