When I load the model file, I get the following warning, which I don't understand:
Some weights of the model checkpoint at /home/tu/device/backup/wjjia/DRhard/adore-star were not used when initializing RobertaDot: ['classifier.dense.weight', 'classifier.dense.bias', 'classifier.out_proj.weight', 'classifier.out_proj.bias', 'roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']
This IS expected if you are initializing RobertaDot from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
This IS NOT expected if you are initializing RobertaDot from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of RobertaDot were not initialized from the model checkpoint at /home/tu/device/backup/wjjia/DRhard/adore-star and are newly initialized: ['roberta.embeddings.position_ids']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
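For context, the load that produces this warning looks roughly like the sketch below (the exact import path of RobertaDot in the DRhard code and the use of a separate config object are assumptions on my side):

```python
# Minimal sketch of the load that triggers the warning (assumed setup).
# RobertaDot is the dense-retrieval model class from DRhard; the import
# path below is hypothetical and may need adjusting to the repo layout.
from transformers import RobertaConfig
from model import RobertaDot  # hypothetical import

checkpoint_dir = "/home/tu/device/backup/wjjia/DRhard/adore-star"

config = RobertaConfig.from_pretrained(checkpoint_dir)
model = RobertaDot.from_pretrained(checkpoint_dir, config=config)
```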