neuralmind-ai / portuguese-bert

Portuguese pre-trained BERT models

Loading model on a BertForSequenceClassification class #19

Closed arthurmarcal closed 4 years ago

arthurmarcal commented 4 years ago

transformers==3.0.2 on a Colab environment

When I load the model into the BertForSequenceClassification class:

model = BertForSequenceClassification.from_pretrained('neuralmind/bert-base-portuguese-cased',
                                                      num_labels=2,
                                                      output_attentions=False,
                                                      output_hidden_states=False)

I get the following warning message:

Some weights of the model checkpoint at neuralmind/bert-base-portuguese-cased were not used when initializing BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias']

- This IS expected if you are initializing BertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPretraining model).

- This IS NOT expected if you are initializing BertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

Some weights of BertForSequenceClassification were not initialized from the model checkpoint at neuralmind/bert-base-portuguese-cased and are newly initialized: ['classifier.weight', 'classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

1) Is it OK to use this model for classification tasks? 2) Can I still benefit from the pre-trained weights?

fabiocapsouza commented 4 years ago

Hi @arthurmarcal ,

1) Yes. The first warning is expected because the checkpoint is of a BertForPreTraining model, which has classification heads for the pretraining tasks (cls module). The second warning (Some weights of BertForSequenceClassification were not initialized from the model checkpoint...) is also expected: classifier is the new classification layer that will be trained on your task.
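For reference, the classifier head the warning refers to can be inspected directly. Below is a minimal sketch using a tiny, hypothetical BertConfig so that no checkpoint download is needed; the sizes are illustrative only, not those of the real `neuralmind/bert-base-portuguese-cased` model:

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Hypothetical tiny config for illustration; the real model would be loaded with
# BertForSequenceClassification.from_pretrained('neuralmind/bert-base-portuguese-cased', num_labels=2)
config = BertConfig(
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=2,
)
model = BertForSequenceClassification(config)

# 'classifier' is the newly initialized linear head the second warning mentions.
# Its weights are random until you fine-tune on your task.
print(type(model.classifier))           # a torch.nn.Linear layer
print(model.classifier.weight.shape)    # (num_labels, hidden_size)
```

The `cls.*` weights listed in the first warning belong to the pre-training heads (masked LM and next-sentence prediction), which BertForSequenceClassification simply does not have, so they are discarded on load.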

2) Yes, because this way you only need to learn a few thousand parameters for your task (the weights of the classifier module). Otherwise you would have to train all 110M~330M parameters from scratch on your classification dataset.

arthurmarcal commented 4 years ago

Cool! Thanks Fabio.

dezoito commented 3 years ago

@arthurmarcal , were you able to get this running? If so, did it improve your results or processing time?