jcjohnson / torch-rnn

Efficient, reusable RNNs and LSTMs for torch
MIT License

Automatically stop when improvement slows? #171

Open ctnlaring opened 7 years ago

ctnlaring commented 7 years ago

I'd like an option to stop training automatically once it has plateaued, i.e. when the loss stays effectively the same for some number of epochs or iterations.
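The behavior being requested is usually called patience-based early stopping. A minimal sketch of the idea (hypothetical names, not part of torch-rnn, and in Python rather than the project's Lua for illustration): stop once the best loss has not improved by at least `min_delta` for `patience` consecutive checks.

```python
class EarlyStopper:
    """Stop training when loss stops improving for `patience` checks."""

    def __init__(self, patience=5, min_delta=1e-4):
        self.patience = patience      # how many flat checks to tolerate
        self.min_delta = min_delta    # smallest change that counts as progress
        self.best = float("inf")
        self.bad_checks = 0

    def should_stop(self, loss):
        if loss < self.best - self.min_delta:
            # Meaningful improvement: record it and reset the counter.
            self.best = loss
            self.bad_checks = 0
        else:
            self.bad_checks += 1
        return self.bad_checks >= self.patience
```

Called once per validation check, it would trip only after the configured number of stagnant evaluations:

```python
stopper = EarlyStopper(patience=3)
for loss in [1.0, 0.9, 0.8, 0.8, 0.8, 0.8]:
    if stopper.should_stop(loss):
        break  # triggers on the third flat check
```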

dgcrouse commented 7 years ago

Loss can plateau and then decrease again as the learning rate varies; this is very common in deep learning. Because of that, I'm not sure this would be a useful feature to add.
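One way to reconcile the two views above is to treat a plateau as a signal to shrink the learning rate first, and only stop once the rate has hit a floor. A hypothetical sketch (the names and defaults here are illustrative, not a torch-rnn API):

```python
def on_plateau(lr, lr_floor=1e-5, decay=0.5):
    """React to a detected plateau.

    Returns (new_lr, should_stop): halve the rate while it is still
    above the floor; stop only once further decay would go below it.
    """
    if lr * decay >= lr_floor:
        return lr * decay, False   # keep training at a smaller rate
    return lr, True                # rate exhausted: genuinely plateaued
```

This avoids the failure mode dgcrouse describes, since a plateau at a high learning rate just triggers a decay step instead of ending the run.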