
ACL 2020 | Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models #38

Closed: richardbaihe closed this issue 4 years ago

richardbaihe commented 4 years ago

This paper focuses on pre-training language models for discourse tasks. Unlike NSP/SOP, it proposes a new pre-training sub-task: predicting whether the distance between two sentences falls within a window of K sentences. According to the paper's experiments, the best K is 2.
Experimental results show that the model achieves SOTA on a variety of discourse tasks.
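
To make the objective concrete, below is a minimal sketch of a contrastive sentence-distance loss in the spirit of the paper: an anchor sentence must identify the candidate that actually lies within its K-sentence window from a pool that also contains random negatives. The function name, the dot-product scorer, and the tensor shapes are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def contrastive_distance_loss(anchor, candidates, target_idx):
    """Minimal sketch of a contrastive sentence-distance objective.

    anchor:      (batch, hidden)          encoding of the anchor sentence
    candidates:  (batch, n_cand, hidden)  encodings of one true in-window
                                          sentence plus random negatives
    target_idx:  (batch,)                 index of the true candidate
    """
    # Score every candidate against the anchor with a dot product.
    logits = torch.einsum('bh,bnh->bn', anchor, candidates)
    # Softmax cross-entropy: the model is trained to pick the candidate
    # that lies within the K-sentence window of the anchor.
    return F.cross_entropy(logits, target_idx)

# Toy usage: 1 true target followed by 4 random negatives per anchor.
anchor = torch.randn(8, 768)
candidates = torch.randn(8, 5, 768)
target_idx = torch.zeros(8, dtype=torch.long)
loss = contrastive_distance_loss(anchor, candidates, target_idx)
```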