chantera closed this pull request 1 year ago
As reported in #166, the current studio-ousia/mluke-large-lite-finetuned-conll-2003 model was fine-tuned using an erroneous entity_attention_mask.
I suggest updating the model on Hugging Face after merging this PR.
Thank you for fixing this problem! The PR looks good to me.
We will try fine-tuning models with the correct padding and update the Hugging Face model with a better one!
Related to #166, this PR fixes the entity_attention_mask in examples/ner/evaluate_transformers_checkpoint.py.
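For context, the intended behavior is that entity_attention_mask is 1 for real entity spans and 0 for padded entity slots. Below is a minimal sketch of that padding rule; the function name pad_entity_inputs and its arguments are illustrative only and are not the actual code in evaluate_transformers_checkpoint.py, which is changed in this PR's diff.

```python
import torch

def pad_entity_inputs(entity_ids_list, pad_id=0, max_entities=None):
    """Pad variable-length entity id lists to a fixed length and build the
    matching entity_attention_mask (1 = real entity, 0 = padding).

    Illustrative helper only; names and signature are assumptions.
    """
    if max_entities is None:
        max_entities = max(len(ids) for ids in entity_ids_list)

    entity_ids = torch.full((len(entity_ids_list), max_entities), pad_id, dtype=torch.long)
    entity_attention_mask = torch.zeros((len(entity_ids_list), max_entities), dtype=torch.long)

    for i, ids in enumerate(entity_ids_list):
        entity_ids[i, : len(ids)] = torch.tensor(ids, dtype=torch.long)
        # Only real entity slots get attention; padded slots stay 0.
        entity_attention_mask[i, : len(ids)] = 1

    return entity_ids, entity_attention_mask


# Example: two sentences with 3 and 1 candidate entity spans respectively.
ids, mask = pad_entity_inputs([[5, 8, 2], [7]])
print(mask)
# tensor([[1, 1, 1],
#         [1, 0, 0]])
```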