liuwei1206 / LEBERT

Code for the ACL2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter"

Some weights of the model checkpoint at ../berts/bert/pytorch_model.bin were not used when initializing WCBertCRFForTokenClassification #37

Closed: bultiful closed this 2 years ago

bultiful commented 2 years ago

Hi, some weights are not used when training. See the screenshot below:

Some weights of the model checkpoint at ../berts/bert/pytorch_model.bin were not used when initializing WCBertCRFForTokenClassification

[screenshot of the warning]

liuwei1206 commented 2 years ago

Hi,

This is normal. The checkpoint contains the parameters of the original BERT, which was pretrained on the masked language model and next sentence prediction tasks. For downstream tasks we only use the BERT encoder, so the layers for masked-word prediction and next-sentence prediction are not needed. Consequently, those weights are not used and the warning is expected.

Best

bultiful commented 2 years ago

Thank you!