ctnlaring opened 7 years ago
I'd like an option to stop training once it has plateaued, i.e. when the loss stays effectively the same for X amount of time/cycles.
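A minimal sketch of what such a plateau check could look like, assuming a patience-based rule (the class name and the `patience`/`min_delta` parameters are hypothetical, not part of any existing API):

```python
class PlateauStopper:
    """Signal a stop when the loss has not improved by at least
    `min_delta` for `patience` consecutive checks."""

    def __init__(self, patience=10, min_delta=1e-4):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")   # best loss seen so far
        self.stale = 0             # checks since last real improvement

    def should_stop(self, loss):
        if loss < self.best - self.min_delta:
            # Meaningful improvement: reset the stale counter.
            self.best = loss
            self.stale = 0
        else:
            self.stale += 1
        return self.stale >= self.patience
```

The caller would invoke `should_stop(loss)` once per cycle and break out of the training loop when it returns `True`.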
Loss can plateau and then decrease again as the learning rate varies; this is very common in deep learning. Because of that, I'm not sure this would be a useful feature to add.