nlpyang / PreSumm

Code for the EMNLP 2019 paper Text Summarization with Pretrained Encoders
MIT License

About .stories and .train.bert.pt #221

Open LiuChen19960902 opened 3 years ago

LiuChen19960902 commented 3 years ago

Hello, I downloaded the CNN and Daily Mail stories and followed steps 1-5 in README.md. CNN has 90k+ stories and Daily Mail has 200k+ stories, but preprocessing only generated 140+ .train.bert.pt files, which surprised me. Is this expected?
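For reference, one way to sanity-check the output is to count how many preprocessed examples the shards actually contain, rather than the number of shard files (preprocessing groups many stories into each .bert.pt file). A minimal sketch, assuming the shards live under a `bert_data/` directory and that each shard is a Python list of example dicts, as PreSumm's preprocessing produces; the glob pattern is an assumption, adjust it to your output paths:

```python
# Sketch: count preprocessed examples across the generated .bert.pt shards.
# Assumes each shard is a list of example dicts (PreSumm preprocessing output);
# the "bert_data" path and filename pattern are assumptions, not from the issue.
import glob
import torch

total = 0
shards = sorted(glob.glob("bert_data/*.train.*.bert.pt"))
for path in shards:
    data = torch.load(path)  # one shard = a list of preprocessed examples
    total += len(data)
    print(f"{path}: {len(data)} examples")

print(f"{len(shards)} shards, {total} examples in total")
```

If the total example count roughly matches the number of stories, the small number of files is just the sharding, not lost data.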