Dear Lasse,
Thanks for your good work on this package. I am currently exploring its potential use within cross-validation, and I would like to optimize the maximum number of epochs for learning. Setting a fixed max epoch after some initial tests seems less favourable than optimizing it over the folds. Is it somehow possible to enable early stopping within the learner? On my data (currently an artificial classification problem), the autoencoder already converges quite early (~80 epochs), so I fear overfitting when taking a learner trained for 300 epochs or similar.
Is it possible to optimize the epoch finding within this package, or do you have a suggestion for how to address this?
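For illustration, the kind of stopping rule I have in mind could be sketched roughly as follows (a generic patience-based criterion on a validation-loss curve, not tied to this package's API; the function name and parameters are hypothetical):

```python
# Hypothetical sketch: decide when to stop training based on a
# validation-loss history, using a simple "patience" criterion.
def early_stopping_epoch(val_losses, patience=10, min_delta=0.0):
    """Return the epoch at which training should stop.

    Training stops once the validation loss has not improved by more
    than `min_delta` for `patience` consecutive epochs; the best
    weights would be those recorded at the best epoch seen so far.
    """
    best = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            best = loss
            best_epoch = epoch
        elif epoch - best_epoch >= patience:
            return epoch  # patience exhausted; stop here
    return len(val_losses) - 1  # never triggered; ran to the end

# Toy loss curve: falls until ~epoch 80, then plateaus.
losses = [1.0 / (1 + e) for e in range(80)] + [1.0 / 81] * 220
stop = early_stopping_epoch(losses, patience=10)
```

Inside cross-validation, such a rule could be applied per fold on a held-out portion of the training split, so each fold finds its own stopping epoch instead of sharing one global maximum.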
Best regards, Frank