studio-ousia / luke

LUKE -- Language Understanding with Knowledge-based Embeddings

Fix entity_attention_mask for NER. #172

Closed (chantera closed this pull request 1 year ago)

chantera commented 1 year ago

Related to #166, this PR fixes the entity_attention_mask used in examples/ner/evaluate_transformers_checkpoint.py.
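
For context, the intended semantics of entity_attention_mask are 1 for real entity positions and 0 for padded ones. Below is a minimal sketch of that padding behavior, not the PR's actual diff; the helper name pad_entity_inputs, the pad_id default, and the tensor layout are illustrative assumptions.

```python
# Minimal sketch of the intended entity_attention_mask semantics;
# NOT the actual code from evaluate_transformers_checkpoint.py.
import torch

def pad_entity_inputs(entity_ids_per_example, pad_id=0):
    """Pad variable-length entity id lists to a common length and build
    entity_attention_mask: 1 for real entities, 0 for padding."""
    batch_size = len(entity_ids_per_example)
    max_entities = max(len(ids) for ids in entity_ids_per_example)
    entity_ids = torch.full((batch_size, max_entities), pad_id, dtype=torch.long)
    entity_attention_mask = torch.zeros((batch_size, max_entities), dtype=torch.long)
    for i, ids in enumerate(entity_ids_per_example):
        entity_ids[i, : len(ids)] = torch.tensor(ids, dtype=torch.long)
        entity_attention_mask[i, : len(ids)] = 1  # attend only to real entities
    return entity_ids, entity_attention_mask
```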

chantera commented 1 year ago

As reported in #166, the current studio-ousia/mluke-large-lite-finetuned-conll-2003 model was fine-tuned with an erroneous entity_attention_mask. I suggest updating the model on Hugging Face after this PR is merged.
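
For reference, that checkpoint is usually exercised through the transformers entity span classification API, where the tokenizer fills in entity_attention_mask from the supplied spans. The snippet below is a usage sketch following the standard LUKE/mLUKE docs pattern, not this repository's evaluation script; the example text and spans are illustrative.

```python
# Usage sketch (standard transformers LUKE/mLUKE span classification pattern);
# the text and spans are illustrative, not taken from this repository.
from transformers import MLukeTokenizer, LukeForEntitySpanClassification

name = "studio-ousia/mluke-large-lite-finetuned-conll-2003"
tokenizer = MLukeTokenizer.from_pretrained(name)
model = LukeForEntitySpanClassification.from_pretrained(name)

text = "Beyoncé lives in Los Angeles"
entity_spans = [(0, 7), (17, 28)]  # character-level (start, end) of candidate spans
inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
# The tokenizer sets entity_attention_mask to 1 for each supplied span,
# matching the mask semantics this PR fixes in the evaluation script.
outputs = model(**inputs)
predicted = outputs.logits.argmax(-1)
```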

ryokan0123 commented 1 year ago

Thank you for fixing this problem! The PR looks good to me.

We will try fine-tuning models with the correct padding and update the Hugging Face model with a better one!