toru34 / li_emnlp_2017

Deep Recurrent Generative Decoder for Abstractive Text Summarization in DyNet
61 stars 16 forks source link

Great work! #4

Open momoz44 opened 6 years ago

momoz44 commented 6 years ago

Hi, I'm currently doing some summarization research and I'm quite impressed by your work. I'm trying to train the model myself on the same dataset, which has over 3 million samples. Just wondering how long it took you to produce the pretrained model?

toru34 commented 6 years ago

Thanks for using my code.

It took about a week on a GeForce 1060.

The reason it took so long is probably that some functions in the code do not support autobatching (https://github.com/clab/dynet/issues/1082, https://github.com/clab/dynet/issues/1059).

You can speed up the code by writing a manually batched version. I may add this in the future.
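To illustrate the idea behind manual batching (independent of DyNet's API), here is a minimal NumPy sketch: instead of applying an operation to each example in a Python loop, which is the pattern DyNet's autobatcher tries to merge automatically, you stack the inputs yourself and issue one batched operation. All names and shapes here are illustrative assumptions, not code from this repository.

```python
import numpy as np

# Illustrative only: a toy linear layer applied to several examples.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))                  # weights of a toy linear layer
xs = [rng.standard_normal(3) for _ in range(5)]  # 5 independent input examples

# Unbatched: one matrix-vector product per example (slow on a GPU).
unbatched = [W @ x for x in xs]

# Manually batched: stack examples as columns and do a single
# matrix-matrix product covering the whole batch.
X = np.stack(xs, axis=1)   # shape (3, 5)
batched = W @ X            # shape (4, 5)

# The batched result matches the per-example loop column by column.
assert all(np.allclose(batched[:, i], unbatched[i]) for i in range(5))
```

In DyNet the same principle applies with batched expressions (e.g. batched lookups and batched losses) in place of the per-example graph construction that autobatching would otherwise have to merge.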
