huawei-noah / Pretrained-Language-Model

Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.

multilingual TinyBERT #190

Open YegorKhodak opened 2 years ago

YegorKhodak commented 2 years ago

1) What should I do if I want to apply TinyBERT to languages such as French, Spanish, or German? Training TinyBERT involves two steps: first general distillation, then task-specific distillation.
2) Where can I find pre-trained TinyBERT weights for the languages above (i.e., weights after general distillation)?
3) Is there a multilingual TinyBERT model?
4) How long would general distillation take for another language, assuming I already have a pre-trained BERT-base model in that language?
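For context on what general distillation optimizes, below is a minimal NumPy sketch of a TinyBERT-style layer-wise distillation objective: MSE on attention matrices, MSE on hidden states (with the student projected up to the teacher's width), and soft cross-entropy on the output logits. This is an illustration of the loss described in the TinyBERT paper, not the repo's actual training code; all function and variable names here (`tinybert_distill_loss`, `W_h`, etc.) are made up for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def tinybert_distill_loss(teacher_attn, student_attn,
                          teacher_hidden, student_hidden, W_h,
                          teacher_logits, student_logits, T=1.0):
    """Sketch of a TinyBERT-style layer-wise distillation objective.

    teacher_attn / student_attn:   attention matrices for a matched layer pair
    teacher_hidden / student_hidden: hidden states for a matched layer pair
    W_h: learnable projection from student width to teacher width
    T: softmax temperature for the prediction-layer loss
    """
    # Attention transfer: student mimics the teacher's attention patterns.
    attn_loss = np.mean((teacher_attn - student_attn) ** 2)
    # Hidden-state transfer: project the narrower student representation
    # into the teacher's dimension before comparing.
    hidden_loss = np.mean((teacher_hidden - student_hidden @ W_h) ** 2)
    # Prediction-layer transfer: soft cross-entropy between tempered logits.
    p_teacher = softmax(teacher_logits / T)
    log_p_student = np.log(softmax(student_logits / T))
    pred_loss = -np.mean(np.sum(p_teacher * log_p_student, axis=-1))
    return attn_loss + hidden_loss + pred_loss
```

In general distillation this objective is minimized over a large unlabeled corpus in the target language, which is why the main practical requirement in question 4 is a strong BERT-base teacher for that language plus sufficient monolingual text.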

ernado-x commented 2 years ago

+1