Closed by anwai98, 3 days ago
I will think about how this interacts with continuing training from an existing checkpoint.
Hi @constantinpape,

I took care of the interaction between continuing training and overwriting trained models (i.e. an error is now raised when `overwrite_training` is set to `False` and the user passes a custom checkpoint to continue training).

GTG from my side!
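The guard described in this comment could look roughly like the sketch below. The function and argument names (`check_overwrite_training`, `load_from_checkpoint`) are illustrative assumptions; the actual names inside `DefaultTrainer.fit` may differ.

```python
# Hedged sketch of the guard described above: refuse to continue training
# from a user-supplied checkpoint when overwriting finished trainings is
# disabled. Argument names are assumptions, not the library's actual API.
def check_overwrite_training(load_from_checkpoint, overwrite_training=True):
    if not overwrite_training and load_from_checkpoint is not None:
        raise ValueError(
            "Passing a custom checkpoint to continue training is not "
            "supported when 'overwrite_training' is set to False."
        )
```

With this in place, `fit` would fail fast on the conflicting combination instead of silently retraining or skipping.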
This PR adds optional support to the `DefaultTrainer` for skipping training when it has already been completed. This is controlled via the `fit` method through the `overwrite_training` argument, which defaults to `True` (i.e. the current behavior of the trainers). If it is set to `False`, the trainer checks the `latest.pt` checkpoint to verify whether training has already finished.
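The completion check on `latest.pt` could be sketched as below. The checkpoint key names (`iteration`, `max_iteration`) are assumptions about the checkpoint layout, not confirmed by the PR.

```python
# Hedged sketch: decide whether a training run is already finished based on
# the counters stored in a loaded 'latest.pt' checkpoint. The key names
# 'iteration' and 'max_iteration' are assumptions about the layout.
def is_training_done(checkpoint):
    iteration = checkpoint.get("iteration", 0)
    max_iteration = checkpoint.get("max_iteration")
    if max_iteration is None:
        # No target recorded, so we cannot conclude the training is done.
        return False
    return iteration >= max_iteration
```

In practice the checkpoint dictionary would first be loaded from disk, e.g. with `torch.load(os.path.join(checkpoint_folder, "latest.pt"), map_location="cpu")`, and `fit` would return early when `overwrite_training=False` and the check reports completion.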