nlpyang / PreSumm

code for EMNLP 2019 paper Text Summarization with Pretrained Encoders
MIT License

How to train TransformerExt baseline? #224

Open GongShuai8210 opened 3 years ago

GongShuai8210 commented 3 years ago

How do I train the TransformerExt baseline? I just changed the encoder to 'baseline', but the ROUGE scores I get are higher than the paper reports. I suspect the model is still using BERT, but I am not sure how to use the plain Transformer encoder correctly.
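
For reference, a minimal sketch of the kind of command I mean, based on the README's BertSumExt example with -encoder baseline added; the ext_* sizes follow the TransformerExt setup described in the paper (6 layers, 512 hidden units, 2048 feed-forward size), and the other values are guesses that would need to be confirmed against train.py and the paper:

    python train.py -task ext -mode train -encoder baseline \
      -bert_data_path BERT_DATA_PATH -model_path MODEL_PATH \
      -ext_layers 6 -ext_hidden_size 512 -ext_ff_size 2048 -ext_heads 8 -ext_dropout 0.1 \
      -lr 2e-3 -warmup_steps 10000 -batch_size 3000 -accum_count 2 -train_steps 50000 \
      -save_checkpoint_steps 1000 -report_every 50 -use_interval true -max_pos 512 \
      -visible_gpus 0 -log_file ../logs/ext_baseline_cnndm

If this is roughly right, one way to check whether BERT is still being loaded would be to compare checkpoint sizes: a randomly initialized baseline encoder of this size should produce a noticeably smaller checkpoint than bert-base.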