mickeysjm / R-BERT

Pytorch re-implementation of R-BERT model

Bert base eval f1 score stuck at 0.85 #6

Open kemalaraz opened 4 years ago

kemalaraz commented 4 years ago

Hello there,

I am using bert-base-uncased and haven't changed config.ini (I just commented out the bert-large entry and use bert-base instead), but your README reports an F1 of 0.88 on SemEval while I am stuck at 0.85. What might be the problem?
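For reference, the change described above would look roughly like the following in an ini-style config file; the key names here are purely hypothetical and may not match the actual config.ini in this repo:

# hypothetical excerpt -- the real key names in this repo's config.ini may differ
# bert_model = bert-large-uncased
bert_model = bert-base-uncased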

EmanuelaBoros commented 3 years ago


Hello @kemalaraz, there is a chance that you forgot to apply the official scorer script for SemEval 2010 Task 8. The official metric is the macro-averaged F1 over the nine relation classes (excluding Other, with directionality taken into account), which usually differs from the score printed during training. After training finishes, apply the script to the final test results:

$ cd eval
$ bash test.sh
$ cat res.txt
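For context, that scoring step essentially runs the Perl scorer distributed with the SemEval 2010 Task 8 data on a proposed-answers file and the answer key. The sketch below is only an assumption about what eval/test.sh boils down to; the file names are illustrative, not necessarily the ones the script actually uses:

# Rough sketch only; file names are illustrative and may differ from eval/test.sh.
# Both files use the official format: one "<sentence_id>\t<relation(e1,e2)>" line per test example.
$ perl semeval2010_task8_scorer-v1.2.pl proposed_answers.txt answer_keys.txt > res.txt
# The macro-averaged F1 reported near the end of res.txt is the number to compare with the README.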