danpovey / pocolm

Small language toolkit for creation, interpolation and pruning of ARPA language models

--warm-start-ratio 1 #71

Closed danpovey closed 7 years ago

danpovey commented 7 years ago

This is an issue in train_lm.py. For a small data set I asked someone to test with --warm-start-ratio set to 1, and the script did not work. Can someone please implement a check so that if it is set to 1, the warm-start optimization is skipped?

Dan
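The requested guard could be sketched as follows. This is a hypothetical illustration, not the actual train_lm.py code: `run_optimization`, `optimize`, and its parameters are invented names standing in for pocolm's metaparameter-optimization step.

```python
def run_optimization(warm_start_ratio, optimize):
    """Optimize metaparameters, warm-starting from a subsampled
    data set unless warm_start_ratio == 1.

    `optimize` is a stand-in callable for the real optimization step;
    it takes an optional initial point and an optional subsampling ratio.
    """
    if warm_start_ratio == 1:
        # A subsampling ratio of 1 keeps the full data set, so the
        # warm-start pass would just duplicate the full optimization.
        # Skip it and optimize directly on the full data.
        return optimize(initial=None, subsample=None)
    # Otherwise, first optimize on a 1/warm_start_ratio subsample,
    # then use the result to warm-start the full optimization.
    warm_params = optimize(initial=None, subsample=warm_start_ratio)
    return optimize(initial=warm_params, subsample=None)
```

With this guard, a ratio of 1 results in a single optimization pass instead of an attempted (and here failing) warm-start pass.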

danpovey commented 7 years ago

Forgot that I had already implemented this but not pushed it... closing.