mickeysjm / R-BERT

PyTorch re-implementation of the R-BERT model
GNU General Public License v3.0

Original Paper #3

Closed qiunlp closed 4 years ago

qiunlp commented 4 years ago

Your model's final result is 90.16, while the "Original Paper" result is 89.25. What is this "Original Paper", and where does the 89.25 come from? Thanks!

mickeysjm commented 4 years ago

The original paper is https://arxiv.org/pdf/1905.08284.pdf

I directly copied the result from the original paper, which reports 89.25.

Thanks.

tyistyler commented 4 years ago

Excuse me, the original paper mentions: "For the pre-trained BERT model, we use the uncased basic model".