Beastmaster closed this issue 4 years ago.
The fastai optimizer wraps the original optimizer, and the wrapper provides class methods (property setters) that let you update the lr in exactly this way. There are two ways to build an optimizer: `build_one_cycle_optimizer` and `build_optimizer`. The optimizer built by `build_one_cycle_optimizer` is wrapped by the fastai optimizer, so the assignment goes through the wrapper rather than a plain `torch.optim` optimizer.
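For context, here is a minimal sketch of that pattern, assuming a fastai-style wrapper (illustrative only, not Det3D's actual `OptimWrapper`; the class name `OptimWrapperSketch` is hypothetical). The key point is that the `lr` property setter writes the value through into the wrapped optimizer's `param_groups`:

```python
import torch


class OptimWrapperSketch:
    """Illustrative fastai-style wrapper around a torch optimizer."""

    def __init__(self, opt: torch.optim.Optimizer):
        self.opt = opt

    @property
    def lr(self):
        # Read the lr back from the wrapped optimizer's param groups.
        return self.opt.param_groups[-1]['lr']

    @lr.setter
    def lr(self, val):
        # Propagate the new lr into every param group, so the next
        # call to step() actually uses it.
        for group in self.opt.param_groups:
            group['lr'] = val

    def step(self):
        self.opt.step()

    def zero_grad(self):
        self.opt.zero_grad()


# With the wrapper, attribute assignment really updates the optimizer:
params = [torch.nn.Parameter(torch.zeros(1))]
wrapped = OptimWrapperSketch(torch.optim.SGD(params, lr=0.1))
wrapped.lr = 0.01
assert wrapped.opt.param_groups[0]['lr'] == 0.01
```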
Is any further explanation needed for this question?
In file det3d/solver/learning_schedules_fastai.py, line 61, the assignment `self.optimizer.lr = lrs[-1]` will not change the lr actually used by the optimizer. I think the proper way of updating the lr is `self.optimizer.param_groups[0]['lr'] = lrs[-1]`.
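For comparison, a small sketch with a plain `torch.optim` optimizer (no fastai wrapper), where the attribute assignment indeed has no effect on training and only the `param_groups` update does:

```python
import torch

params = [torch.nn.Parameter(torch.zeros(1))]
opt = torch.optim.SGD(params, lr=0.1)

# Plain attribute assignment: Python happily creates an unused `lr`
# attribute on the optimizer object; step() never reads it.
opt.lr = 0.01
print(opt.param_groups[0]['lr'])  # still 0.1

# Writing into param_groups changes the lr that step() actually uses.
opt.param_groups[0]['lr'] = 0.01
print(opt.param_groups[0]['lr'])  # now 0.01
```

So the assignment on line 61 is only effective when `self.optimizer` is the fastai wrapper, which is the case for optimizers built by `build_one_cycle_optimizer`.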