a1da4 / paper-survey

Summary of machine learning papers

Reading: HistBERT: A Pre-trained Language Model for Diachronic Lexical Semantic Analysis #231

a1da4 opened this issue 2 years ago

a1da4 commented 2 years ago

0. Paper

1. What is it?

They try to answer these research questions:

2. What is amazing compared to previous works?

3. Where is the key to technologies and techniques?

[Screenshot: 2022-06-11 10:27:16]

Training starts from the last checkpoint of the pre-trained BERT model.
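Since HistBERT continues BERT's masked-language-model pretraining from an existing checkpoint rather than training from scratch, the key data step is the standard BERT-style corruption of input tokens. Below is a minimal, dependency-free sketch of that corruption step (not the authors' code; function and variable names are my own placeholders):

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """BERT-style MLM corruption: select ~15% of positions as prediction
    targets; of those, 80% become [MASK], 10% a random vocabulary token,
    10% stay unchanged. Returns (corrupted tokens, target positions)."""
    rng = random.Random(seed)
    vocab = list(set(tokens))  # toy vocabulary drawn from the input itself
    out, targets = list(tokens), []
    for i in range(len(tokens)):
        if rng.random() < mask_prob:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                out[i] = mask_token          # 80%: replace with [MASK]
            elif r < 0.9:
                out[i] = rng.choice(vocab)   # 10%: replace with random token
            # else 10%: keep the original token
    return out, targets
```

In continued pretraining, batches corrupted this way (over historical text such as COHA) are fed to the model loaded from the original BERT checkpoint, so the weights adapt to period-specific usage instead of starting from random initialization.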

4. How did they evaluate it?

HistBERT models outperform the original BERT model.
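For diachronic lexical semantic analysis, a common way such models are compared is to average a word's contextual embeddings within each time period and measure cosine distance between the period averages, where larger distance signals semantic change. A toy sketch with hand-made 3-dimensional vectors standing in for real BERT outputs (the word choice and numbers are illustrative, not the paper's data):

```python
import math

def mean_vector(vectors):
    """Average a word's contextual embeddings from one time period."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy contextual embeddings for "broadcast" in two periods
old = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]  # earlier sense: "scatter seed"
new = [[0.1, 0.9, 0.2], [0.0, 0.8, 0.3]]  # later sense: "transmit by radio/TV"

# Higher score = more semantic change between periods
change_score = 1 - cosine(mean_vector(old), mean_vector(new))
```

A historically adapted model should place period-appropriate senses closer to period-appropriate contexts, which is one plausible route to the reported gains over the original BERT.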

5. Is there a discussion?

6. Which paper should we read next?