victoresque / pytorch-template

PyTorch deep learning projects made easy.
MIT License

Maybe a bug in lr_scheduler when resuming #68

Open Outliers1106 opened 4 years ago

Outliers1106 commented 4 years ago

In trainer.py, at line 71, self.lr_scheduler.step() should be replaced by self.lr_scheduler.step(epoch). The source code of the step function is:

def step(self, epoch=None):
    if epoch is None:
        epoch = self.last_epoch + 1
    self.last_epoch = epoch
    for param_group, lr in zip(self.optimizer.param_groups, self.get_lr()):
        param_group['lr'] = lr

Because when resuming from a checkpoint, the lr_scheduler is re-initialized from scratch, the parameter last_epoch defaults to -1 rather than the last epoch stored in the checkpoint. Calling step() without an argument then advances from -1 instead of from the resumed epoch.
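A minimal sketch of why this matters, using a hypothetical ToyScheduler that copies the step() semantics quoted above (the class and its decay rule are illustrative, not the project's actual code):

```python
class ToyScheduler:
    """Simplified stand-in for torch.optim.lr_scheduler._LRScheduler."""

    def __init__(self, base_lr=0.1, gamma=0.5, step_size=2, last_epoch=-1):
        self.base_lr = base_lr
        self.gamma = gamma
        self.step_size = step_size
        self.last_epoch = last_epoch  # -1 by default, as in PyTorch
        self.lr = base_lr

    def get_lr(self):
        # StepLR-style decay: halve the lr every `step_size` epochs.
        return self.base_lr * self.gamma ** (self.last_epoch // self.step_size)

    def step(self, epoch=None):
        # Same logic as the step() source quoted in the issue.
        if epoch is None:
            epoch = self.last_epoch + 1
        self.last_epoch = epoch
        self.lr = self.get_lr()

# Suppose training ran through epoch 5 and a checkpoint was saved, but the
# scheduler is rebuilt from scratch on resume, so last_epoch is back at -1.
resumed = ToyScheduler()
resumed.step()            # step() with no argument: last_epoch becomes 0
wrong_epoch = resumed.last_epoch

resumed = ToyScheduler()
resumed.step(epoch=6)     # step(epoch) as the issue suggests
right_epoch = resumed.last_epoch
```

With step(), the resumed scheduler restarts at epoch 0 and the learning rate decay schedule is reset; with step(epoch), it continues from the correct position. An alternative fix, for PyTorch versions that support it, is to checkpoint and restore the scheduler itself via lr_scheduler.state_dict() / lr_scheduler.load_state_dict(), which also restores last_epoch.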