timoschick / pet

This repository contains the code for "Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference"
https://arxiv.org/abs/2001.07676
Apache License 2.0

There is no softmax #105

Open bookpen opened 11 months ago

bookpen commented 11 months ago

I noticed that no softmax function is applied where the prediction distribution should be obtained.


In wrapper.py:

```python
def mlm_train_step(self, labeled_batch: Dict[str, torch.Tensor],
                   unlabeled_batch: Optional[Dict[str, torch.Tensor]] = None,
                   lm_training: bool = False, alpha: float = 0, **_) -> torch.Tensor:
    """Perform a MLM training step."""

    inputs = self.generate_default_inputs(labeled_batch)
    mlm_labels, labels = labeled_batch['mlm_labels'], labeled_batch['labels']

    outputs = self.model(**inputs)
    prediction_scores = self.preprocessor.pvp.convert_mlm_logits_to_cls_logits(mlm_labels, outputs[0])
    loss = nn.CrossEntropyLoss()(prediction_scores.view(-1, len(self.config.label_list)), labels.view(-1))
```

Here, prediction_scores is passed to the loss without a softmax being applied.
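For reference, PyTorch's nn.CrossEntropyLoss applies log-softmax to its input internally and therefore expects raw logits, which may explain why no explicit softmax appears in the training step. A minimal sketch of that equivalence (the tensor shapes below are made up for illustration and are not taken from the repository):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative shapes only: 4 examples, 3 labels.
logits = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 1])

# CrossEntropyLoss applies log-softmax internally ...
loss_ce = nn.CrossEntropyLoss()(logits, labels)

# ... so it matches NLLLoss applied to explicitly log-softmaxed scores.
loss_nll = nn.NLLLoss()(F.log_softmax(logits, dim=-1), labels)

assert torch.allclose(loss_ce, loss_nll)

# A probability distribution over labels can still be obtained at
# prediction time by applying softmax explicitly:
probs = torch.softmax(logits, dim=-1)
```

So for training the raw logits are sufficient; if a normalized distribution over labels is needed at inference time, softmax can be applied to the logits at that point.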