iver56 closed this issue 8 years ago
Well, if you were doing regression, for example, or tracking validation loss instead of accuracy, 'min'
would be applicable. So it's hard to say which one should be the default.
Perhaps it would be better to not use a default criterion and always have the user provide it?
Documentation would definitely help. I will add something later, but you're welcome to chip in too.
I think the above changes clarify behavior quite well. This issue can be closed now.
I tried to put this in the mnist example:
Training would stop after the first 3 epochs, with the message "Stopping because validation.Accuracy did not improve for 3 epochs."
That behavior didn't match my expectations. The program behaved correctly when I changed `criterion` to `'max'`, though. Perhaps that should be the default value? Or is it more common to watch a decreasing error than an increasing accuracy?

Documentation with example usage of EarlyStopper would help.
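To illustrate the 'min'/'max' distinction discussed above, here is a minimal, hypothetical sketch of an early stopper. The class name, constructor arguments, and `step` method are assumptions for illustration, not this library's actual API:

```python
# Hypothetical sketch of early stopping with a 'min'/'max' criterion;
# names and signatures are illustrative, not this library's actual API.

class EarlyStopper:
    def __init__(self, patience=3, criterion='min'):
        # criterion='min' suits losses/errors; 'max' suits accuracy-like metrics
        assert criterion in ('min', 'max')
        self.patience = patience
        self.criterion = criterion
        self.best = None
        self.bad_epochs = 0

    def step(self, value):
        """Record a metric value; return True if training should stop."""
        improved = (
            self.best is None
            or (self.criterion == 'min' and value < self.best)
            or (self.criterion == 'max' and value > self.best)
        )
        if improved:
            self.best = value
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


# With criterion='max', rising accuracy counts as improvement,
# so a plateau of 3 epochs triggers the stop on the last value.
stopper = EarlyStopper(patience=3, criterion='max')
accuracies = [0.90, 0.92, 0.91, 0.91, 0.91]
stops = [stopper.step(a) for a in accuracies]
# stops == [False, False, False, False, True]
```

With `criterion='min'` and the same values, the first reading would be treated as the best, and training would stop three epochs later, which matches the premature stop described above when watching validation accuracy.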