NLPScott / bert-Chinese-classification-task

BERT Chinese classification practice

Pretrained model #5

Open NiceMartin opened 5 years ago

NiceMartin commented 5 years ago

How was the BERT pretrained model used here obtained? Or is BERT applied directly to the classification task, without pretraining via Masked LM and Next Sentence Prediction?

NLPScott commented 5 years ago

The Chinese BERT model is loaded directly from https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip. The input is passed through that model's transformer encoder, and only the loss computation is changed: a cross-entropy loss is added on top, and the whole model is fine-tuned with it.
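For readers who want to see what this setup looks like in code, here is a minimal sketch of "pretrained encoder + cross-entropy classification head" written against the Hugging Face transformers API rather than this repo's own scripts; the hub name "bert-base-chinese" corresponds to the chinese_L-12_H-768_A-12 checkpoint, and the class name BertClassifier and num_labels value are illustrative, not taken from the repo.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertClassifier(nn.Module):
    """Pretrained Chinese BERT encoder with a linear classification head."""

    def __init__(self, num_labels, pretrained="bert-base-chinese"):
        super().__init__()
        # Load the pretrained Chinese BERT encoder (12 layers, hidden size 768).
        self.bert = BertModel.from_pretrained(pretrained)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask, labels=None):
        # Run the transformer encoder and take the pooled [CLS] representation.
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        logits = self.classifier(outputs.pooler_output)
        if labels is not None:
            # Cross-entropy on the classification logits, as described above.
            loss = nn.CrossEntropyLoss()(logits, labels)
            return loss, logits
        return logits


tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertClassifier(num_labels=2)
batch = tokenizer(["这部电影很好看", "质量太差了"], padding=True, return_tensors="pt")
loss, logits = model(batch["input_ids"], batch["attention_mask"],
                     labels=torch.tensor([1, 0]))
```

During fine-tuning, both the encoder weights and the new classification head are updated by backpropagating this cross-entropy loss.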

wutonghua commented 5 years ago

model.bert.load_state_dict(torch.load(args.init_checkpoint, map_location='cpu'))
RuntimeError: Error(s) in loading state_dict for BertModel: Missing key(s) in state_dict:
Is there any way to fix this? Thanks!
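A "Missing key(s) in state_dict" error usually means the parameter names in the checkpoint do not match the names the model expects, often because of a prefix such as "bert.". The sketch below shows one common way to diagnose and work around this; it reuses args.init_checkpoint and model.bert from the call above, and the prefix handling is an assumption about the mismatch, not a confirmed fix for this repo.

```python
import torch

# Load the raw checkpoint and compare its keys with what the model expects.
state_dict = torch.load(args.init_checkpoint, map_location="cpu")
model_keys = set(model.bert.state_dict().keys())
ckpt_keys = set(state_dict.keys())
print("missing from checkpoint:", sorted(model_keys - ckpt_keys)[:5])
print("unexpected in checkpoint:", sorted(ckpt_keys - model_keys)[:5])

# If the checkpoint keys carry an extra "bert." prefix, strip it before loading.
state_dict = {k[len("bert."):] if k.startswith("bert.") else k: v
              for k, v in state_dict.items()}

# strict=False tolerates keys present on only one side (e.g. the classifier head);
# check the returned lists to confirm the encoder weights were actually loaded.
missing, unexpected = model.bert.load_state_dict(state_dict, strict=False)
print("still missing:", missing)
```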

FakerYFX commented 5 years ago

Hello, I've run into a similar problem. Have you solved it? @wutonghua