gitabtion / SoftMaskedBert-PyTorch

🙈 An unofficial implementation of SoftMaskedBert based on huggingface/transformers.
MIT License

On the meaning of cor_labels and det_labels #8

Closed kovnew closed 3 years ago

kovnew commented 3 years ago

Hello, I'd like to ask: what do cor_labels and det_labels each represent?

class BertForCsc(CscTrainingModel):
    def __init__(self, cfg, tokenizer):
        super().__init__(cfg)
        self.cfg = cfg
        self.bert = BertForMaskedLM.from_pretrained(cfg.MODEL.BERT_CKPT)
        self.tokenizer = tokenizer

    def forward(self, texts, cor_labels=None, det_labels=None):
        if cor_labels is not None:
            text_labels = self.tokenizer(cor_labels, padding=True, return_tensors='pt')['input_ids']
            text_labels = text_labels.to(self.cfg.MODEL.DEVICE)
            print('text labels: ', text_labels)
            # Tokens with indices set to -100 are ignored (masked)
            text_labels[text_labels == 0] = -100
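In case a sketch helps frame the question: in the usual Chinese Spelling Correction (CSC) setup, cor_labels appear to be the corrected target sentences (tokenized into text_labels for the masked-LM loss), while det_labels are per-token 0/1 flags marking which input positions contain a typo. The token ids below are made up for illustration; 0 stands in for BERT's [PAD] id:

```python
# Toy example (illustrative ids only; 0 = [PAD]).
src_ids = [101, 2523, 1999, 102, 0, 0]   # input sentence with a typo at index 2
tgt_ids = [101, 2523, 1962, 102, 0, 0]   # cor_labels tokenized: the corrected sentence

# det_labels: 1 where source and correction differ (a typo), else 0.
det_labels = [int(s != t) for s, t in zip(src_ids, tgt_ids)]

# text_labels for the correction loss: pad positions set to -100,
# matching the snippet above, so the loss ignores them.
text_labels = [-100 if t == 0 else t for t in tgt_ids]
```

Here det_labels comes out as [0, 0, 1, 0, 0, 0] and the two padding positions of text_labels become -100, which is the index CrossEntropyLoss ignores by default.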