Describe the bug
PyTorch's built-in learning-rate schedulers differ in when `step()` should be called: some expect a call after every batch update, while others should be called only once per epoch. However, the default trainer always calls `step()` on every batch update.
Expected behavior
Each scheduler's `step()` should be called at the point its implementation expects: per batch for batch-wise schedulers, per epoch for epoch-wise ones.
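A minimal sketch of the intended behavior, with no dependency on torch itself: the trainer dispatches `step()` per batch or per epoch depending on the scheduler's expected interval. The `interval` attribute, `RecordingScheduler`, and `run_training` are all hypothetical names for illustration, not the trainer's actual API.

```python
# Hypothetical sketch: call scheduler.step() at the interval each
# scheduler expects, instead of unconditionally on every batch.
PER_BATCH = "batch"   # e.g. OneCycleLR, CyclicLR step once per batch
PER_EPOCH = "epoch"   # e.g. StepLR, MultiStepLR step once per epoch

class RecordingScheduler:
    """Stand-in for a torch scheduler; records how often step() runs."""
    def __init__(self, interval):
        self.interval = interval
        self.steps = 0

    def step(self):
        self.steps += 1

def run_training(schedulers, num_epochs, batches_per_epoch):
    for _ in range(num_epochs):
        for _ in range(batches_per_epoch):
            # ... forward/backward/optimizer.step() would go here ...
            for s in schedulers:
                if s.interval == PER_BATCH:
                    s.step()
        for s in schedulers:
            if s.interval == PER_EPOCH:
                s.step()

batch_sched = RecordingScheduler(PER_BATCH)
epoch_sched = RecordingScheduler(PER_EPOCH)
run_training([batch_sched, epoch_sched], num_epochs=2, batches_per_epoch=5)
print(batch_sched.steps)  # stepped once per batch: 2 * 5 = 10
print(epoch_sched.steps)  # stepped once per epoch: 2
```

With 2 epochs of 5 batches, the batch-wise scheduler is stepped 10 times and the epoch-wise one twice, which is the behavior the fix should enforce.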