saurabhvyas closed this issue 6 years ago
It turns out I just needed to change the batch size to something suited to my i5 CPU; batch_size=2 works fine. Thanks to @martinpopel. You can add the following argument when running from the terminal: --hparams="batch_size=123"
@saurabhvyas How does this work? Can I set multiple hparams like that and override some from an hparam_set?
Something like
t2t-trainer \
--hparams="learning_rate=0.1337,learning_rate_decay_scheme=rsqrt_decay"
# ..
Yes, you can use multiple hparams on the command line and override the hparam_set this way.
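Conceptually, the --hparams flag is a comma-separated key=value string that is parsed and overlaid on top of the defaults from the chosen hparam_set. Here is a minimal, illustrative sketch of that mechanism in plain Python (this is not the actual tensor2tensor parser; the function name and the example defaults are made up for illustration):

```python
# Illustrative sketch of comma-separated hparam overrides.
# NOT the real tensor2tensor implementation -- just the general idea.
def apply_overrides(base, override_str):
    """Parse a string like "k1=v1,k2=v2" and overlay it onto defaults."""
    hparams = dict(base)
    if not override_str:
        return hparams
    for pair in override_str.split(","):
        key, value = pair.split("=", 1)
        key = key.strip()
        if key not in hparams:
            # Unknown keys are rejected rather than silently ignored.
            raise KeyError("unknown hparam: %s" % key)
        # Coerce the string value to the type of the default.
        hparams[key] = type(hparams[key])(value)
    return hparams

# Hypothetical defaults standing in for an hparam_set:
defaults = {
    "learning_rate": 1.0,
    "batch_size": 4096,
    "learning_rate_decay_scheme": "noam",
}
overridden = apply_overrides(
    defaults, "learning_rate=0.1337,learning_rate_decay_scheme=rsqrt_decay")
```

After this call, learning_rate and learning_rate_decay_scheme take the command-line values while batch_size keeps its default, which is the behavior you rely on when combining --hparams_set with --hparams.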
Nvm, I used the wrong setting.
@martinpopel Thanks - I'm asking because I can't seem to set the following:
--hparams='learning_rate=0.15,learning_rate_decay_scheme=exp_decay,learning_rate_schedule=exp_decay'
for the transformer model. For some reason, the learning_rate starts at a value of 1 and remains constant over training steps. :/
Python: 3.5, TF: 1.5