atulkum / pointer_summarizer

pytorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Apache License 2.0
904 stars 243 forks

Need help with retraining and cross-validation #18

Open atulkum opened 5 years ago

atulkum commented 5 years ago

Need help with retraining and cross-validation, to check whether the ROUGE score matches (or beats) the numbers reported in the paper.
I only trained for 500k iterations (batch size 8) with pointer generation enabled and coverage loss disabled, then another 100k iterations (batch size 8) with pointer generation enabled and coverage loss enabled.

It would be great if someone could help re-run these experiments and see whether we can improve the result and match the paper.

You might need a better GPU though (my current one is a GTX 1070 with 8 GB).
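
For anyone re-running this, here is a minimal sketch of that two-phase schedule. It assumes the flag names in `data_util/config.py` (`pointer_gen`, `is_coverage`, `batch_size`, `max_iterations`); adjust them if your copy of the config differs.

```python
# data_util/config.py -- phase 1 (a sketch; flag names assumed, check your config)
pointer_gen = True        # copy/pointer mechanism on
is_coverage = False       # coverage loss off for the first 500k iterations
batch_size = 8
max_iterations = 500000

# Phase 2: enable coverage for the final ~100k iterations and restart training
# from the phase-1 checkpoint (pass its path to the train script):
# is_coverage = True
# max_iterations = 100000
```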

pengzhi123 commented 5 years ago

I'd really like to help, but I only have a 1080 Ti (12 GB) and I don't know how to change the code to get a BLEU score. Sorry.

atulkum commented 5 years ago

You can compare the ROUGE score too. I used a 1070 with 8 GB and it took 3 days to train for 500k iterations. On a 1080 Ti it should be faster.
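
If anyone wants to reproduce the comparison, here is a short pyrouge sketch for scoring decoded summaries against references. It assumes the decode step writes `<id>_decoded.txt` / `<id>_reference.txt` files as in the original pointer-generator pipeline; the directory paths are placeholders.

```python
# Score decoded summaries against references with pyrouge
# (requires a working ROUGE-1.5.5 installation).
from pyrouge import Rouge155

r = Rouge155()
r.system_dir = 'decode_dir/decoded'        # placeholder paths
r.model_dir = 'decode_dir/reference'
r.system_filename_pattern = r'(\d+)_decoded.txt'
r.model_filename_pattern = '#ID#_reference.txt'

output = r.convert_and_evaluate()
scores = r.output_to_dict(output)
print('ROUGE-1 F:', scores['rouge_1_f_score'])
print('ROUGE-2 F:', scores['rouge_2_f_score'])
print('ROUGE-L F:', scores['rouge_l_f_score'])
```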

pengzhi123 commented 5 years ago

> You can compare the ROUGE score too. I used a 1070 with 8 GB and it took 3 days to train for 500k iterations. On a 1080 Ti it should be faster.

I have finished the 100k test and am now running the 500k test.

atulkum commented 5 years ago

That's great. One more option would be to train for 700k iterations, save a checkpoint every 50k, and check which checkpoint gives the best result.
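
A rough sketch of that sweep is below. The `model_*` checkpoint naming and the `evaluate` callable are hypothetical; plug in whatever decode/eval routine you actually use.

```python
import glob
import os

def best_checkpoint(ckpt_dir, evaluate):
    """Evaluate every saved checkpoint and return the best one.

    `evaluate` is a hypothetical callable(path) -> ROUGE score; swap in
    your own decode + ROUGE evaluation routine.
    """
    best_path, best_score = None, float('-inf')
    for path in sorted(glob.glob(os.path.join(ckpt_dir, 'model_*'))):
        score = evaluate(path)
        print(f'{os.path.basename(path)}: {score:.4f}')
        if score > best_score:
            best_path, best_score = path, score
    return best_path, best_score
```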

pengzhi123 commented 5 years ago

> That's great. One more option would be to train for 700k iterations, save a checkpoint every 50k, and check which checkpoint gives the best result.

I've done 288k of the 500k run, and I will start the 700k test in 2 days. I'm going to upload it to Dropbox, or you can pick which one you'd like me to share.

atulkum commented 5 years ago

You don't need to upload the model. You can just report the ROUGE score.

pengzhi123 commented 5 years ago

> You don't need to upload the model. You can just report the ROUGE score.

ok.

shivam13juna commented 5 years ago

@atulkum did you try this model on some external data? For example, how do you convert a plain CSV file of text data to the .bin format? And could you upload the pretrained weights as well? @pengzhi123
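
On the CSV-to-.bin question, here is a hedged sketch of a converter. It assumes the same binary format as the CNN/DailyMail `make_datafiles.py` preprocessing: each record is a serialized `tf.train.Example` with `article` and `abstract` byte features, prefixed with an 8-byte length, and the abstract wrapped in `<s>`/`</s>` sentence tags. The CSV column names and file paths are just examples; the original pipeline also tokenizes the text (Stanford PTB tokenizer) and builds a vocab file, which you would still need to do separately.

```python
# Convert a CSV with 'article' and 'summary' columns into the .bin format
# the data loader expects (length-prefixed serialized tf.train.Example records).
import csv
import struct

from tensorflow.core.example import example_pb2

SENTENCE_START = '<s>'
SENTENCE_END = '</s>'

def write_bin(csv_path, bin_path, article_col='article', summary_col='summary'):
    with open(csv_path, newline='', encoding='utf-8') as f_in, \
         open(bin_path, 'wb') as f_out:
        for row in csv.DictReader(f_in):
            article = row[article_col].strip()
            # The pipeline expects the abstract wrapped in sentence tags.
            abstract = f'{SENTENCE_START} {row[summary_col].strip()} {SENTENCE_END}'

            tf_example = example_pb2.Example()
            tf_example.features.feature['article'].bytes_list.value.extend(
                [article.encode('utf-8')])
            tf_example.features.feature['abstract'].bytes_list.value.extend(
                [abstract.encode('utf-8')])

            serialized = tf_example.SerializeToString()
            # 8-byte length prefix followed by the serialized example.
            f_out.write(struct.pack('q', len(serialized)))
            f_out.write(struct.pack(f'{len(serialized)}s', serialized))

write_bin('my_data.csv', 'train.bin')  # hypothetical file names
```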

pengzhi123 commented 5 years ago

Sorry for only posting the results now. Our machine broke down, and I only trained to 660k. The experimental results (values in parentheses are confidence intervals):

| Checkpoint | Metric | F-score | Recall | Precision |
|---|---|---|---|---|
| 100k (batch size 8) | ROUGE-1 | 0.3420 (0.3397, 0.3443) | 0.3830 (0.3803, 0.3856) | 0.3288 (0.3263, 0.3312) |
| 100k (batch size 8) | ROUGE-2 | 0.1401 (0.1382, 0.1420) | 0.1568 (0.1545, 0.1590) | 0.1350 (0.1331, 0.1369) |
| 100k (batch size 8) | ROUGE-L | 0.3105 (0.3083, 0.3126) | 0.3475 (0.3448, 0.3500) | 0.2987 (0.2964, 0.3010) |
| 500k (batch size 8) | ROUGE-1 | 0.3603 (0.3580, 0.3624) | 0.4006 (0.3980, 0.4032) | 0.3475 (0.3449, 0.3500) |
| 500k (batch size 8) | ROUGE-2 | 0.1538 (0.1515, 0.1560) | 0.1703 (0.1679, 0.1727) | 0.1492 (0.1469, 0.1514) |
| 500k (batch size 8) | ROUGE-L | 0.3292 (0.3270, 0.3313) | 0.3659 (0.3633, 0.3684) | 0.3177 (0.3153, 0.3202) |

atulkum commented 5 years ago

Thanks for doing this. Did you enable coverage loss for this result?

pengzhi123 commented 5 years ago

https://www.dropbox.com/s/czed99yiqjo34f3/training_log?dl=0

liruowei0919 commented 3 years ago

@pengzhi123 Hi there, can I ask what machine you are running it on? It seems really fast.