huawei-noah / Pretrained-Language-Model

Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.

Unrecognized model in huawei-noah/TinyBERT_4L_zh #140

Open thesby opened 3 years ago

thesby commented 3 years ago

python: 3.7, transformers: 4.9.2, pytorch: 1.8.1

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("huawei-noah/TinyBERT_4L_zh")
model = AutoModel.from_pretrained("huawei-noah/TinyBERT_4L_zh")

ValueError: Unrecognized model in huawei-noah/TinyBERT_4L_zh. Should have a model_type key in its config.json, or contain one of the following strings in its name: visual_bert, canine, roformer, clip, bigbird_pegasus, deit, luke, detr, gpt_neo, big_bird, speech_to_text, vit, wav2vec2, m2m_100, convbert, led, blenderbot-small, retribert, ibert, mt5, t5, mobilebert, distilbert, albert, bert-generation, camembert, xlm-roberta, pegasus, marian, mbart, megatron-bert, mpnet, bart, blenderbot, reformer, longformer, roberta, deberta-v2, deberta, flaubert, fsmt, squeezebert, hubert, bert, openai-gpt, gpt2, transfo-xl, xlnet, xlm-prophetnet, prophetnet, xlm, ctrl, electra, encoder-decoder, funnel, lxmert, dpr, layoutlm, rag, tapas

zwjyyc commented 3 years ago

Hi, I think the reason is that the Transformers library does not include a TinyBERT model type. TinyBERT has a similar architecture to BERT, so you can load it in Transformers as a BERT model.
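Since the error comes from the missing model_type key, another way to keep the Auto* classes working is to add that key to the checkpoint's config.json yourself. Below is a minimal sketch of that idea; patch_model_type is a hypothetical helper name, and the assumption (per the comment above) is that declaring TinyBERT as "bert" is safe because the architectures match.

```python
import json

def patch_model_type(config_json: str, model_type: str = "bert") -> str:
    """Add a model_type key to a checkpoint's config.json contents.

    TinyBERT shares BERT's architecture, so declaring it as "bert"
    should let AutoTokenizer/AutoModel dispatch to the BERT classes.
    An existing model_type is left untouched.
    """
    config = json.loads(config_json)
    config.setdefault("model_type", model_type)
    return json.dumps(config, indent=2)

# Minimal example mimicking a TinyBERT config without model_type.
original = '{"hidden_size": 312, "num_hidden_layers": 4}'
patched = json.loads(patch_model_type(original))
print(patched["model_type"])  # bert
```

After writing the patched config back into a local copy of the checkpoint directory, AutoTokenizer.from_pretrained and AutoModel.from_pretrained on that local path should no longer raise the "Unrecognized model" error.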

Chiang97912 commented 1 year ago

TinyBERT and BERT have a similar architecture, so you can use BertTokenizer and BertModel instead of the Auto classes to solve this problem.

from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained("huawei-noah/TinyBERT_4L_zh")
model = BertModel.from_pretrained("huawei-noah/TinyBERT_4L_zh")