huawei-noah / Pretrained-Language-Model

Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.

TinyBERT for masked LM #198

Open danchern97 opened 2 years ago

danchern97 commented 2 years ago

Hi! I've been trying to measure MLM perplexity for a TinyBERT model (in particular, tinybert6l), and I keep getting inconsistent results. It looks like the MLM head for TinyBERT is not loaded properly when the checkpoint is loaded with AutoModelForMaskedLM or BertForMaskedLM.
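One way to diagnose this is to check which parameter names the MLM head expects and whether they are present in the checkpoint. The sketch below builds a `BertForMaskedLM` from a tiny config (so it runs offline) and lists the head's `cls.*` parameter names; if a TinyBERT checkpoint lacks these keys, `from_pretrained()` re-initializes the head randomly, which would explain the inconsistent perplexity. The checkpoint name in the comment is an assumption.

```python
from transformers import BertConfig, BertForMaskedLM

# Tiny config so this sketch runs quickly without downloading a model.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
)
model = BertForMaskedLM(config)

# Parameter names belonging to the MLM prediction head live under "cls.".
head_keys = [name for name, _ in model.named_parameters() if name.startswith("cls.")]
print(head_keys)

# When loading a real checkpoint, output_loading_info=True reports which
# keys were missing from the file and therefore randomly initialized.
# Checkpoint name below is a hypothetical example:
# model, info = BertForMaskedLM.from_pretrained(
#     "huawei-noah/TinyBERT_General_6L_768D", output_loading_info=True
# )
# print(info["missing_keys"])  # non-empty "cls.*" entries => head not loaded
```

If `missing_keys` contains `cls.predictions.*` entries, the distilled checkpoint simply does not ship an MLM head, and any perplexity computed with it reflects a randomly initialized decoder.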