senarvi / theanolm

TheanoLM is a recurrent neural network language modeling tool implemented using Theano
Apache License 2.0
81 stars 29 forks

Getting models from each epoch #22

Closed bplank closed 7 years ago

bplank commented 7 years ago

Hi @senarvi great work with theanolm, thanks. I was wondering, is there an option to store the models of every epoch? (for now it overwrites MODELNAME each time) Thanks! Barbara

senarvi commented 7 years ago

There's no such option. Why would you want to do that?

bplank commented 7 years ago

Because I'm using the LM to score for another task (where the selection criterion isn't perplexity on the dev set).

bplank commented 7 years ago

But I see that others wouldn't benefit from such an option, so I'm closing this. Thanks!

senarvi commented 7 years ago

There are probably not so many people who need this option, but for a quick experiment, maybe you could just copy the file to a new name after each epoch, or each time the model is saved here:

https://github.com/senarvi/theanolm/blob/master/theanolm/training/trainer.py#L455
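A minimal sketch of that workaround, assuming the model file path and the current epoch number are available at the point where the model is saved (the function name and arguments here are hypothetical, not part of TheanoLM's API):

```python
import shutil


def save_epoch_copy(model_path, epoch):
    """Copy MODELNAME to MODELNAME.epochN so each epoch's model survives.

    Call this right after the trainer writes the model file, so the copy
    reflects the state at the end of that epoch.
    """
    copy_path = "{}.epoch{}".format(model_path, epoch)
    shutil.copyfile(model_path, copy_path)
    return copy_path
```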

bplank commented 7 years ago

Excellent, thanks! I'll do that.

Another small note: running on a GPU works fine, but to get the code running on a CPU I had to disable the exception block in bin/theanolm that handles the following import:

from theano.gpuarray.type import ContextNotDefined

thanks!
