playma / LCSTS2.0-clean


some question about LCSTS #1

Open · sys1874 opened 6 years ago

sys1874 commented 6 years ago

@playma Hello playma, I am doing some research on LCSTS, and it is taking me a lot of time to train the model on this dataset with a single TITAN Xp GPU. Could you tell me how much time it took you to train your model, and what kind of GPU you used for it? Thanks.

playma commented 6 years ago

I used a GTX 1080Ti to train the model. If you want to train a classical Seq2Seq model, it takes about one day. The framework I used is OpenNMT-py.
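For reference, a classical Seq2Seq baseline in OpenNMT-py looks roughly like the sketch below. This is an assumption about a typical setup, not my exact configuration: the file names are hypothetical, and flag names vary across OpenNMT-py versions (older releases used `-gpuid` instead of `-gpu_ranks`).

```shell
# Preprocess: build vocabularies and binarized shards
# (source/target file names here are hypothetical placeholders)
python preprocess.py \
    -train_src lcsts_train.src -train_tgt lcsts_train.tgt \
    -valid_src lcsts_valid.src -valid_tgt lcsts_valid.tgt \
    -save_data data/lcsts

# Train a default attentional encoder-decoder on one GPU
python train.py \
    -data data/lcsts \
    -save_model models/lcsts_seq2seq \
    -world_size 1 -gpu_ranks 0
```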

We proposed a hybrid word-character model for Chinese abstractive summarization; you can see the arXiv paper.

We will update the research results soon, because the paper is under submission.

The best ROUGE scores on LCSTS 1.0:
R-1: 62.26 // R-2: 52.49 // R-L: 60.00

The best ROUGE scores on LCSTS 2.0-clean:
R-1: 44.38 // R-2: 32.26 // R-L: 41.35
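To make the R-1 / R-2 numbers above concrete, here is a minimal sketch of F1-style ROUGE-N as n-gram overlap. This is a simplified illustration, not the official scorer: LCSTS evaluations for Chinese are typically run at the character level with a full ROUGE toolkit, and the example strings are made up.

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """F1-style ROUGE-N: harmonic mean of n-gram precision and recall."""
    def ngrams(tokens, n):
        # Multiset of n-grams, so repeated n-grams are counted
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    if not cand or not ref:
        return 0.0
    # Clipped overlap: each n-gram counts at most as often as in either side
    overlap = sum((cand & ref).values())
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy character-level example (hypothetical strings, not LCSTS data):
cand = list("今天天气好")    # candidate summary, as characters
ref = list("今天天气很好")   # reference summary, as characters
score = rouge_n(cand, ref, n=1)  # ≈ 0.909 (10/11)
```

Computing ROUGE-2 is the same call with `n=2`; ROUGE-L additionally needs a longest-common-subsequence computation rather than fixed n-grams.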