nlpyang / BertSum

Code for paper Fine-tune BERT for Extractive Summarization
Apache License 2.0

Extractive Setting? #127

Open acc-galenicum opened 2 years ago

acc-galenicum commented 2 years ago

Hi, this might be a dumb question, but I am not getting it.

This model is supposed to perform extractive summarization. But when I look at the raw data (cnn_stories), each file contains the article text with some highlights at the end (I assume these are the summary). The problem is that these highlights are not sentences from the original text, so I don't understand how the raw data fits an extractive setting.

To put a specific example I attach a story file. 00a308681faf9c82a0e62a89b21fcdefb84b88fa.txt

Can anyone help me out with this? Thanks in advance.

acc-galenicum commented 2 years ago

OK, self-response in case anyone wonders: I missed the part of the paper that explains this. The extractive labels (the "oracle" summary) are created from the highlights, i.e. the abstractive summary, by selecting the sentences from the article text that maximize the ROUGE score against those highlights.
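For anyone wanting to see what that label construction looks like, here is a minimal sketch of a greedy oracle selection. This is an illustration, not the repo's actual code: `rouge_1_f` is a simplified unigram-overlap F1 (BertSum's preprocessing uses a fuller ROUGE computation), and `greedy_oracle` and `max_sents` are hypothetical names.

```python
from collections import Counter

def rouge_1_f(candidate_tokens, reference_tokens):
    """Simplified ROUGE-1: unigram-overlap F1 between candidate and reference."""
    if not candidate_tokens or not reference_tokens:
        return 0.0
    overlap = sum((Counter(candidate_tokens) & Counter(reference_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(candidate_tokens)
    recall = overlap / len(reference_tokens)
    return 2 * precision * recall / (precision + recall)

def greedy_oracle(doc_sents, highlights, max_sents=3):
    """Greedily pick article-sentence indices whose concatenation
    maximizes ROUGE against the highlights (the abstractive summary)."""
    ref = highlights.lower().split()
    selected, best_score = [], 0.0
    while len(selected) < max_sents:
        best_idx = None
        for i, sent in enumerate(doc_sents):
            if i in selected:
                continue
            cand = " ".join(doc_sents[j] for j in selected + [i]).lower().split()
            score = rouge_1_f(cand, ref)
            if score > best_score:
                best_score, best_idx = score, i
        if best_idx is None:  # no remaining sentence improves the score; stop
            break
        selected.append(best_idx)
    return sorted(selected)
```

The selected indices become the positive labels for the extractive model, which is why the raw highlights never need to appear verbatim in the article.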

roronoazoro29 commented 2 years ago

Excuse me, may I ask a question?

  1. Is it possible to use another dataset that is genuinely extractive (the summary takes a few sentences verbatim from the original text), such as the BBC News dataset https://www.kaggle.com/datasets/pariza/bbc-news-summary?
  2. How do I create the .pt bert_data files and the .story files when using another dataset (in this repo they already exist and only need to be downloaded)? Thank you.
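On question 2, the .story layout itself is simple: the article body, followed by each summary sentence preceded by an `@highlight` marker, as in the CNN/DailyMail files. Below is a minimal sketch of writing such a file from an article/summary pair; `write_story` is an illustrative helper, and the example text is made up. Producing the bert_data .pt files from the .story files would then go through the repo's own preprocessing scripts (tokenization and formatting steps described in the README), which this sketch does not reproduce.

```python
def write_story(article_text, summary_sentences, out_path):
    """Write a CNN/DM-style .story file: article body, then each summary
    sentence on its own line preceded by an @highlight marker."""
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(article_text.strip() + "\n")
        for sent in summary_sentences:
            f.write("\n@highlight\n\n" + sent.strip() + "\n")

# example with invented BBC-style content
write_story(
    "Shares in Acme fell 5% on Tuesday. Analysts blamed weak demand.",
    ["Acme shares drop 5%", "Weak demand blamed"],
    "example.story",
)
```

Once every article/summary pair in the new dataset is written out this way, the existing pipeline should be able to pick the files up in place of the downloaded CNN/DM stories.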