kyzhouhzau / BERT-NER

Use Google's BERT for named entity recognition (CoNLL-2003 as the dataset).
MIT License

How to reproduce your results. #14

Closed · ljch2018 closed this issue 5 years ago

ljch2018 commented 5 years ago

I used the same run command as yours, but I get worse results on the dev dataset.

eval_f = 0.89656204
eval_precision = 0.90508
eval_recall = 0.88843685
global_step = 653
loss = 17.190592

I use "BERT-Base, Multilingual Cased: 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters" as checkpoint, which is public by google at November 23rd, 2018.

ljch2018 commented 5 years ago

When I use "BERT-Base, Cased: 12-layer, 768-hidden, 12-heads , 110M parameters" as checkpoint, F1 reachs 0.93 too.