lonePatient / BERT-NER-Pytorch

Chinese NER(Named Entity Recognition) using BERT(Softmax, CRF, Span)
MIT License

Does self.init_weights() re-initialize BERT's weights? #37

Open renmada opened 3 years ago

renmada commented 3 years ago

If so, why is it done this way?

rmxkyz commented 3 years ago

Hi, as far as I understand, this function does initialize the weights. It can also prune attention heads according to the pruned_heads parameter in the config. Finally, the weights loaded via from_pretrained overwrite the initialized ones. Official docs: https://huggingface.co/transformers/_modules/transformers/modeling_bert.html
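The load order described above can be sketched in plain Python (a simplified mimic, not the actual transformers implementation; the class and parameter names here are hypothetical): every parameter is first freshly initialized, then the pretrained checkpoint overwrites only the keys it contains, so newly added task layers (such as an NER classification head) keep their fresh initialization — which is the point of calling init_weights() first.

```python
import random

class TinyBertForNER:
    """Hypothetical toy model illustrating the init-then-overwrite order."""

    def __init__(self):
        # Analogue of self.init_weights(): every parameter gets a fresh value,
        # including layers that do not exist in the pretrained checkpoint.
        self.params = {
            "bert.encoder.weight": random.random(),  # backbone parameter
            "classifier.weight": random.random(),    # new NER head parameter
        }

    def from_pretrained(self, checkpoint):
        # Overwrite only the keys present in the checkpoint, similar in
        # spirit to load_state_dict(..., strict=False).
        for name, value in checkpoint.items():
            if name in self.params:
                self.params[name] = value

# Pretrained checkpoint contains only the backbone, no task head.
checkpoint = {"bert.encoder.weight": 0.42}

model = TinyBertForNER()
model.from_pretrained(checkpoint)

# The backbone weight is overwritten by the checkpoint; the classifier
# keeps the value it received during initialization.
print(model.params["bert.encoder.weight"])
```

So init_weights() is not wasted work: it guarantees that parameters absent from the checkpoint (pruned or newly added layers) still start from a sensible initialization.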

lvjiujin commented 3 years ago

Here, please refer to this link: Why we need the init_weight function in BERT pretrained model