microsoft / CodeXGLUE


fix: Solving the problems of using Bert and DistilBert for classification tasks on the given datasets. #158

Closed edwardqin-creator closed 1 year ago

edwardqin-creator commented 1 year ago

If we don't change run.py and instead directly change --model_type from RoBERTa to BERT or DistilBERT, the classification tasks fail with dimension errors.

After addressing the dimension errors, the code seems to run smoothly. However, the evaluation process is interrupted when computing the eval_loss and eval_acc.
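The dimension errors come from the output shapes of the two model heads: a masked-LM head emits one vocabulary distribution per token, while a classification head emits one logit vector per sentence. A minimal sketch of the mismatch, assuming the Hugging Face transformers API and using a tiny random-weight config so nothing is downloaded:

```python
import torch
from transformers import BertConfig, BertForMaskedLM, BertForSequenceClassification

# Tiny random-weight config so the example runs without fetching checkpoints.
config = BertConfig(hidden_size=32, num_hidden_layers=2, num_attention_heads=2,
                    intermediate_size=64, vocab_size=100, num_labels=2)

input_ids = torch.randint(0, 100, (1, 8))  # batch of 1, sequence length 8

# Masked-LM head: per-token vocabulary logits -> shape (1, 8, 100)
mlm_logits = BertForMaskedLM(config)(input_ids=input_ids).logits

# Classification head: per-sentence label logits -> shape (1, 2)
cls_logits = BertForSequenceClassification(config)(input_ids=input_ids).logits

print(mlm_logits.shape, cls_logits.shape)
```

A loss or accuracy computation that expects (batch, num_labels) logits will break on the three-dimensional masked-LM output, which matches the interruption seen during evaluation.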

It was later discovered that the wrong model classes were being used for BERT and DistilBERT. BertForMaskedLM is intended for language modeling tasks, while BertForSequenceClassification is designed for text classification. To classify input sentences, we should use BertForSequenceClassification and DistilBertForSequenceClassification.
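Concretely, the fix amounts to wiring the sequence-classification classes into the model-type lookup. A hedged sketch of what the corrected mapping could look like (the actual dictionary lives in run.py; the entries here are illustrative and use class names from the transformers library):

```python
from transformers import (BertConfig, BertForSequenceClassification, BertTokenizer,
                          DistilBertConfig, DistilBertForSequenceClassification,
                          DistilBertTokenizer)

# Each --model_type value maps to (config class, model class, tokenizer class).
# The key point is using the *ForSequenceClassification heads, not *ForMaskedLM.
MODEL_CLASSES = {
    'bert': (BertConfig, BertForSequenceClassification, BertTokenizer),
    'distilbert': (DistilBertConfig, DistilBertForSequenceClassification,
                   DistilBertTokenizer),
}
```

With this mapping, passing --model_type bert or --model_type distilbert selects a head whose output already has the (batch, num_labels) shape the evaluation code expects.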

log: Solving the problems of using Bert and DistilBert for classification tasks on the given datasets.