lxk00 / BERT-EMD


General Distill Stage? #9

Open lsyysl9711 opened 2 years ago

lsyysl9711 commented 2 years ago

Hello! The work is great, and thanks for sharing the code!

But I am confused about the general distill stage: in the BERT-EMD folder I see a file called "general_distill", but it seems that the file is never used. Also, could you tell me what the pregenerated dataset in general_distill.py is?

Thank you very much!

lxk00 commented 2 years ago

This work draws on TinyBERT, and we reuse some of its code. However, we did not perform general distillation ourselves; we directly use the general-distilled model released by TinyBERT. For the general distill stage, you can refer to https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/TinyBERT.
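
If you just want to reuse the released checkpoint the same way, here is a minimal sketch of loading it as the student with the Hugging Face transformers library. The directory name is a placeholder for wherever you downloaded a released checkpoint such as General_TinyBERT_4L_312D, which is in standard BERT format:

```python
# Minimal sketch: initialize the student from a released General TinyBERT
# checkpoint instead of running general_distill.py yourself.
# STUDENT_DIR is a placeholder path; point it at the directory downloaded
# from the TinyBERT release page.
from transformers import BertConfig, BertModel, BertTokenizer

STUDENT_DIR = "General_TinyBERT_4L_312D"  # hypothetical local path

config = BertConfig.from_pretrained(STUDENT_DIR)
tokenizer = BertTokenizer.from_pretrained(STUDENT_DIR, do_lower_case=True)
student = BertModel.from_pretrained(STUDENT_DIR, config=config)

# The task-specific stage (e.g. the EMD-based distillation in this repo)
# then trains `student` against a fine-tuned BERT-base teacher.
```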