dmmiller612 / bert-extractive-summarizer

Easy to use extractive text summarization with BERT
MIT License

Weights not used when initializing BertModel #118

Open LeoDalcegio opened 2 years ago

LeoDalcegio commented 2 years ago

I am receiving the following warning. Is this normal and expected, or am I doing something wrong?

The version of the transformers module and other dependencies are the same as in requirements.txt.

I just want to know if I should be worried about this warning.

Some weights of the model checkpoint at bert-large-uncased were not used when initializing BertModel: ['cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.weight', 'cls.seq_relationship.weight', 'cls.predictions.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.decoder.weight', 'cls.seq_relationship.bias']
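
For reference, a minimal sketch that reproduces the same warning with the transformers library directly, outside the summarizer (assuming the transformers version pinned in requirements.txt, which exposes BertModel.from_pretrained and the transformers logging helpers):

```python
from transformers import BertModel, logging

# Loading the checkpoint into the plain BertModel encoder drops the
# pretraining heads (cls.predictions.* for masked LM and
# cls.seq_relationship.* for next-sentence prediction), which is what
# triggers the "Some weights ... were not used" message.
model = BertModel.from_pretrained("bert-large-uncased")

# The message is informational; it can be silenced if desired.
logging.set_verbosity_error()
```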

koolgax99 commented 1 year ago

Hey @LeoDalcegio, is this issue resolved for you? I am getting the same warning.

cc @dmmiller612 kindly look into this issue