liuwei1206 / LEBERT

Code for the ACL2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter"

What's your pretrained model? #44

Closed lvjiujin closed 2 years ago

lvjiujin commented 2 years ago

Hello, I want to ask which pretrained model you used: bert-base-chinese, bert-wwm, or chinese-roberta-wwm? In your paper you say bert-base, so I want to confirm the specific pretrained model.

liuwei1206 commented 2 years ago

bert-base-chinese. I provide a download link for the pretrained model.
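
For reference, here is a minimal sketch of loading that checkpoint with the Hugging Face `transformers` library. This is not part of the original answer; the hub model ID `bert-base-chinese` and the `transformers` API are assumptions about how a reader would fetch the same weights (LEBERT itself expects the checkpoint files on disk via the repo's download link):

```python
from transformers import BertTokenizer, BertModel

# Assumed hub ID for the checkpoint the author names; the repo's own
# download link provides the same weights as local files.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")

# Quick sanity check on a Chinese sentence.
inputs = tokenizer("中文序列标注", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for bert-base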

lvjiujin commented 2 years ago

> bert-base-chinese. I provide a download link for the pretrained model.

Thank you very much, that's great!