hengluchang / deep-news-summarization

News summarization using sequence to sequence model with attention in TensorFlow.
MIT License

News summarization

News summarization using sequence to sequence model in TensorFlow.

Introduction

This repository demonstrates abstractive summarization of news articles using the TensorFlow sequence-to-sequence model. The model incorporates an attention mechanism and uses LSTM cells for both the encoder and the decoder.
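To illustrate the idea, here is a minimal sketch of dot-product attention over encoder hidden states: at each decoding step, the decoder state scores every encoder state, the scores are softmax-normalized, and the weighted sum becomes the context vector. This is an illustrative toy in plain Python, not the repository's TensorFlow implementation (which also learns the scoring function).

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(decoder_state, encoder_states):
    """Return a context vector: a weighted sum of encoder states,
    weighted by their dot-product similarity to the decoder state."""
    scores = [sum(d * h_i for d, h_i in zip(decoder_state, h))
              for h in encoder_states]
    weights = softmax(scores)
    dim = len(decoder_state)
    return [sum(w * h[i] for w, h in zip(weights, encoder_states))
            for i in range(dim)]

# The context leans toward the encoder state most similar to the decoder state.
context = attend([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```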

This model is trained on one million Associated Press Worldstream news stories from the English Gigaword second edition. The examples below are based on a model trained on an AWS EC2 g2.2xlarge instance for 10 epochs, which took around 20 hours.

For more detailed information, please see our project research paper: Headline Generation Using Recurrent Neural Network.

Examples

News 1

News: A roadside bomb killed five people Thursday near a shelter used as a police recruiting center in northeast Baghdad, police said.

Actual headline: Iraqi police: Bomb kills 5 near police recruiting center in northeast Baghdad

Predicted headline: URGENT Explosion kills five people in Baghdad

News 2

News: The euro hit a record high against the dollar Monday in Asia as concerns over the U.S. subprime mortgage crisis remain a heavy weight on the greenback.

Actual headline: Euro hits record high versus dollar in Asian trading

Predicted headline: Euro hits record high against dollar

How to run

For demonstration, we use the sample file (a very small portion of English Gigaword) from LDC as our dataset to train the model. To reproduce results like the examples above, a larger training set is necessary. You can download trained model parameters, which were trained on a larger portion of Gigaword, by following the instructions in the Download vocabs and trained model parameters section below. The full English Gigaword corpus can be obtained through university libraries.

Prerequisites

$ git clone https://github.com/hengluchang/deep-news-summarization.git
$ cd deep-news-summarization
$ mkdir -p working_dir output

Download vocabs and trained model parameters

$ python download_vocabs_and_trained_params.py ./working_dir

Train your own summarizer

$ python split_data.py
$ python execute.py
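The split step partitions the article/headline pairs before training. As a hypothetical sketch of such a split (the function name and 80/10/10 ratios are assumptions, not what split_data.py necessarily does):

```python
def split(pairs, train_frac=0.8, dev_frac=0.1):
    """Partition (article, headline) pairs into train/dev/test slices."""
    n = len(pairs)
    n_train = int(n * train_frac)
    n_dev = int(n * dev_frac)
    return (pairs[:n_train],
            pairs[n_train:n_train + n_dev],
            pairs[n_train + n_dev:])

# Toy dataset of 10 article/headline pairs.
pairs = [("article %d" % i, "headline %d" % i) for i in range(10)]
train, dev, test = split(pairs)
```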

Testing

$ python execute.py
$ python evaluation.py
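Headline quality is commonly scored by unigram overlap with the reference, as in ROUGE-1 recall. The sketch below shows that style of metric on the euro example from above; evaluation.py's actual metric may differ.

```python
def rouge1_recall(predicted, reference):
    """Fraction of reference unigrams that also appear in the prediction,
    counting each reference token at most once (ROUGE-1 recall)."""
    pred = predicted.lower().split()
    ref = reference.lower().split()
    if not ref:
        return 0.0
    ref_counts = {}
    for w in ref:
        ref_counts[w] = ref_counts.get(w, 0) + 1
    overlap = 0
    for w in pred:
        if ref_counts.get(w, 0) > 0:
            ref_counts[w] -= 1
            overlap += 1
    return overlap / len(ref)

score = rouge1_recall("Euro hits record high against dollar",
                      "Euro hits record high versus dollar in Asian trading")
```

Here "euro", "hits", "record", "high", and "dollar" match 5 of the 9 reference tokens.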

Interactive testing

$ python execute.py

References

- Headline Generation Using Recurrent Neural Network (project research paper)