Closed lvjiujin closed 2 years ago
Hello, I want to ask which pretrained model you used: bert-base-chinese, bert-wwm, or chinese-roberta-wwm? Your paper says bert-base, so I'd like to know the specific pretrained model.
bert-base-chinese. The download link for the pretrained model is provided.
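For reference, a minimal sketch of loading the bert-base-chinese checkpoint with the Hugging Face transformers library (an assumption; the repo may consume the provided download differently):

```python
# Minimal sketch: loading bert-base-chinese via Hugging Face transformers
# (an assumption; the repo may load the downloaded weights differently).
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")

# Encode a sample Chinese sentence and run it through the encoder.
inputs = tokenizer("你好，世界", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, 768)
```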
Thank you very much. Very good!