yandex / faster-rnnlm

Faster Recurrent Neural Network Language Modeling Toolkit with Noise Contrastive Estimation and Hierarchical Softmax

Training time for one billion word benchmark #10

Closed · tokestermw closed this issue 8 years ago

tokestermw commented 8 years ago

Hi, thanks for your RNN work. Very useful so far.

We are thinking of training the model on the one billion word dataset, which comes to about 35 million sentences. I was wondering how long training took for the configuration shown in the docs. It would be great to have a ballpark figure (a week? a month? a couple of days?) before we embark on training on a motherlode of data.

Thanks!

tokestermw commented 8 years ago

Actually, I just noticed you mentioned "15 millions of words per minute". I'll see if I can get somewhere close to that.
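
For a rough sense of scale, here is a back-of-the-envelope estimate of one epoch, assuming the quoted ~15 million words per minute holds on our hardware and that the benchmark is roughly one billion tokens (both figures are assumptions, not measurements):

```python
# Rough estimate of one training epoch on the one billion word benchmark.
# Assumes ~15M words/minute throughput (the figure quoted in the docs) and
# ~1B tokens in the corpus; actual numbers will depend on hardware and config.
corpus_words = 1_000_000_000      # assumed corpus size in tokens
throughput_wpm = 15_000_000       # quoted words per minute

minutes_per_epoch = corpus_words / throughput_wpm
print(f"~{minutes_per_epoch:.0f} minutes (~{minutes_per_epoch / 60:.1f} hours) per epoch")
# -> ~67 minutes (~1.1 hours) per epoch at the quoted speed
```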