aspnetcs opened this issue 2 years ago
Do you change lr_init? lr_init is the initial learning rate, while CosineAnnealingLR is a learning rate scheduler.
Yes. The file LaTeX_OCR_PRO/configs/training.json seems to be useless.
Please refer to model/utils/lr_schedule.py, which defines the LRSchedule object. Warm-up (lr_warm, end_warm) and decay (start_decay, end_decay) are used to schedule the learning rate. The learning rate equals lr_init only when end_warm < epoch < start_decay.
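For illustration, here is a minimal sketch of that schedule in plain Python. The constant warm-up value, the linear decay, and the lr_min floor are my assumptions for the sketch; the exact interpolation lives in model/utils/lr_schedule.py.

```python
# Sketch of the warm-up / plateau / decay schedule described above.
# Assumptions: warm-up holds lr_warm constant, decay is linear, and the
# rate bottoms out at a hypothetical lr_min; the repo's code may differ.
def scheduled_lr(epoch, lr_init=1e-3, lr_warm=1e-4, lr_min=1e-5,
                 end_warm=2, start_decay=10, end_decay=50):
    if epoch <= end_warm:        # warm-up phase
        return lr_warm
    if epoch < start_decay:      # plateau: end_warm < epoch < start_decay
        return lr_init
    if epoch < end_decay:        # linear decay from lr_init toward lr_min
        frac = (epoch - start_decay) / (end_decay - start_decay)
        return lr_init + frac * (lr_min - lr_init)
    return lr_min                # floor after end_decay
```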
This feels quite complicated to me. How can I use PyTorch's lr_scheduler.MultiplicativeLR for training instead? Thanks!!!
You can refer to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiplicativeLR.html for PyTorch's MultiplicativeLR.
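For reference, a minimal usage sketch (the model, optimizer, and the 0.95 per-epoch factor are placeholders, not values from this repo):

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.Adam(model.parameters(), lr=1e-3)

# MultiplicativeLR multiplies the current lr by the factor returned by
# lr_lambda each time scheduler.step() is called (here: 5% decay per epoch).
scheduler = optim.lr_scheduler.MultiplicativeLR(optimizer,
                                                lr_lambda=lambda epoch: 0.95)

for epoch in range(20):
    # ... forward pass, loss.backward(), optimizer.step() ...
    scheduler.step()  # update the learning rate once per epoch
```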
Emmm... it seems you are trying to reproduce this with PyTorch? The lr_scheduler is only a trick to improve performance; you can try any LR schedule on your own, or even drop the scheduler and use a fixed learning rate.
configs/training.json is useless; changing the learning rate in it has no effect at all. The scheduler I have been using is CosineAnnealingLR.
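For anyone reproducing this: a minimal CosineAnnealingLR sketch (T_max=50 and eta_min=1e-6 are illustrative choices, not the settings used in this thread):

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.Adam(model.parameters(), lr=1e-3)

# Anneal the lr from its initial value down to eta_min over T_max epochs,
# following half a cosine period.
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50,
                                                 eta_min=1e-6)

for epoch in range(50):
    # ... forward pass, loss.backward(), optimizer.step() ...
    scheduler.step()  # update the learning rate once per epoch
```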