nlpyang / BertSum

Code for paper Fine-tune BERT for Extractive Summarization
Apache License 2.0
1.47k stars 423 forks

default batch_size is 3000, I don't quite understand, why so huge? #131

Closed yongzhuo closed 2 years ago

yongzhuo commented 2 years ago

default batch_size is 3000, I don't quite understand, why so huge?

beamind commented 2 years ago

The batch_size here is not the batch size in the usual sense (number of examples); it appears to be the number of tokens per batch. Check the "batch" method in data_loader.py for details.
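The idea of a token-count batch size can be sketched as below. This is a simplified illustration, not BertSum's actual implementation: the real `batch` method in data_loader.py handles padding and sorting, which this skips, and `batch_by_tokens` is a hypothetical name.

```python
def batch_by_tokens(examples, max_tokens=3000):
    """Group examples so each batch holds roughly at most max_tokens tokens.

    A minimal sketch of token-count batching, the idea behind setting
    batch_size=3000: with ~500-token documents, each batch would hold
    about 6 examples, so 3000 is not a huge batch at all.
    """
    batches, current, current_tokens = [], [], 0
    for ex in examples:
        n = len(ex)  # token count of this example
        if current and current_tokens + n > max_tokens:
            # Adding this example would exceed the token budget: flush.
            batches.append(current)
            current, current_tokens = [], 0
        current.append(ex)
        current_tokens += n
    if current:
        batches.append(current)
    return batches
```

With this scheme, long documents produce small batches and short documents produce large ones, which keeps GPU memory use roughly constant across batches.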