nlpyang / BertSum

Code for the paper "Fine-tune BERT for Extractive Summarization"
Apache License 2.0

Hi, I wonder: if I want to do a multi-class classification task, what should I change? #117

Closed Anothernewcomer closed 2 years ago

Anothernewcomer commented 2 years ago

I changed the loss function, the sigmoid, and the output size of the linear layer, and of course the data format, but problems with the data dimensions keep occurring, like this:

```
Traceback (most recent call last):
  File "train.py", line 352, in <module>
    train(args, device_id)
  File "train.py", line 285, in train
    trainer.train(train_iter_fct, args.train_steps)
  File "E:\python\BertSum-master (2)\BertSum-master\src\models\trainer.py", line 120, in train
    report_stats)
  File "E:\python\BertSum-master (2)\BertSum-master\src\models\trainer.py", line 304, in _gradient_accumulation
    loss = self.loss(sent_scores, labels.float())
  File "D:\Anaconda\lib\site-packages\torch\nn\modules\module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\Anaconda\lib\site-packages\torch\nn\modules\loss.py", line 932, in forward
    ignore_index=self.ignore_index, reduction=self.reduction)
  File "D:\Anaconda\lib\site-packages\torch\nn\functional.py", line 2317, in cross_entropy
    return nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
  File "D:\Anaconda\lib\site-packages\torch\nn\functional.py", line 2113, in nll_loss
    .format(input.size(0), target.size(0)))
ValueError: Expected input batch_size (5) to match target batch_size (11).
```

This one occurred while computing the loss function. I printed the shapes of sent_scores and labels right before the loss call, and the result is as follows:

```
sent_scores shape: torch.Size([5, 11, 18])
labels shape: torch.Size([11, 18])
```
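(For context: `torch.nn.functional.cross_entropy` expects logits of shape `[N, C]` (or `[N, C, d1, ...]`) together with integer class-index targets of shape `[N]` (or `[N, d1, ...]`). Given an input of `[5, 11, 18]`, it reads 5 as the batch size and 11 as the number of classes, which is exactly where "Expected input batch_size (5) to match target batch_size (11)" comes from. Below is a minimal sketch of shapes that do satisfy that contract, using the sizes from the trace; the reshape/permute calls are illustrative, not BertSum code.)

```python
import torch
import torch.nn.functional as F

batch, n_sents, n_classes = 5, 11, 18                    # sizes from the traceback above

logits = torch.randn(batch, n_sents, n_classes)          # per-sentence class logits
labels = torch.randint(0, n_classes, (batch, n_sents))   # integer class ids (LongTensor)

# Option 1: flatten sentences into the batch dimension -> input [55, 18], target [55].
loss_flat = F.cross_entropy(logits.reshape(-1, n_classes), labels.reshape(-1))

# Option 2: move the class dimension to position 1, as cross_entropy requires
# for inputs with extra dimensions -> input [5, 18, 11], target [5, 11].
loss_perm = F.cross_entropy(logits.permute(0, 2, 1), labels)
```

Note also that `CrossEntropyLoss` wants LongTensor class indices as targets, so the `labels.float()` cast visible at `trainer.py` line 304 (a `BCELoss` convention from the original binary setup) would need to change as well.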

What's more, I realize that the first two dimensions (11 and 18 in this case) are the sizes of the first two batches. I wonder why this happened? Is there anything else I missed changing in order to do a multi-class classification task? Thanks for any reply!
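(A minimal sketch of the kind of head change the question is about, assuming the stock BertSum sentence classifier, a `Linear(hidden_size, 1)` followed by a sigmoid; `MultiClassClassifier`, `num_classes`, and `sentence_loss` are illustrative names, not code from this repo.)

```python
import torch.nn as nn
import torch.nn.functional as F

class MultiClassClassifier(nn.Module):
    """Hypothetical multi-class replacement for the binary sentence scorer:
    one logit per class per sentence, and no sigmoid."""
    def __init__(self, hidden_size, num_classes):
        super().__init__()
        self.linear = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: [batch, n_sents, hidden] sentence vectors from BERT.
        # Return raw logits [batch, n_sents, num_classes]; CrossEntropyLoss
        # applies log-softmax itself, so no activation belongs here.
        return self.linear(x)

def sentence_loss(logits, labels, pad_label=-100):
    # labels: [batch, n_sents] integer class ids; padded sentences marked
    # with pad_label are skipped via ignore_index.
    return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           labels.reshape(-1),
                           ignore_index=pad_label)
```

With a head like this, the remaining `[5, 11, 18]` vs `[11, 18]` discrepancy would have to come from the labels being batched differently from the inputs, so the data pipeline would be the place to check next.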