nodefluxio / vortex

A Deep Learning Model Development Framework for Computer Vision

[BUG] lr scheduler not changed when resume and args is changed #70

Closed triwahyuu closed 4 years ago

triwahyuu commented 4 years ago

Describe the bug
When training is resumed and the scheduler args are changed, the scheduler behavior does not change.

To Reproduce

  1. Train a model with this lr_scheduler config:

     lr_scheduler: {
       method: StepLR,
       args: {step_size: 200}
     }

  2. Resume training with this config:

     lr_scheduler: {
       method: StepLR,
       args: {step_size: 350}
     }

  3. The learning rate is still stepped at epoch 200 instead of epoch 350 (see the sketch below).
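The behavior can be reproduced with PyTorch alone, independent of vortex, because `StepLR.state_dict()` captures `step_size` and `load_state_dict()` restores it over the freshly constructed value. The snippet below is a minimal, illustrative sketch (the model/optimizer and variable names are placeholders, not vortex code):

```python
import torch
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# first run: step_size=200, state saved to a checkpoint
scheduler = StepLR(optimizer, step_size=200)
checkpoint = {"scheduler": scheduler.state_dict()}

# resume with the new config: step_size=350
scheduler = StepLR(optimizer, step_size=350)
scheduler.load_state_dict(checkpoint["scheduler"])

print(scheduler.step_size)  # prints 200, not 350
```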

Additional context
This happens because the lr scheduler loads its state dict from the resumed checkpoint, which overwrites the args given in the new config; this should be changed.
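One possible fix, sketched below under the assumption that the checkpoint stores the scheduler state dict, is to build the scheduler from the new config args and then restore only the progress bookkeeping from the checkpoint. The helper name `resume_scheduler` and its arguments are hypothetical, not part of vortex:

```python
from torch.optim.lr_scheduler import StepLR

def resume_scheduler(optimizer, new_args, checkpoint_state):
    # build the scheduler from the NEW config args, e.g. {"step_size": 350}
    scheduler = StepLR(optimizer, **new_args)
    # restore only the training-progress fields from the old state,
    # so resuming does not undo the config change
    progress_keys = ("last_epoch", "_step_count", "_last_lr")
    progress = {k: v for k, v in checkpoint_state.items() if k in progress_keys}
    scheduler.load_state_dict({**scheduler.state_dict(), **progress})
    return scheduler
```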