Closed Daniellli closed 1 year ago
Hi,
Once the training accuracy starts to stagnate, you can decrease the learning rate. While experimenting, we usually set this parameter to a very high value to make sure the lr was never decreased automatically, and instead manually decreased the learning rate when accuracy stopped improving. Once we knew when to decrease the learning rate, we set the lr_decay_epochs
parameter for convenience, to decrease the lr automatically. I don't believe there is a better way to know a priori when to decrease the learning rate.
(In the detection setup, we decrease the lr at epochs 25 and 27 for SR3D, at epochs 80 and 90 for NR3D, and at epoch 65 for ScanRefer.)
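For illustration, the epoch-based decay described above can be sketched in plain Python. The function name `decayed_lr` and the decay factor `gamma=0.1` are assumptions for this sketch (the repo may use a different factor); the milestones `[25, 27]` mirror the SR3D schedule mentioned above.

```python
def decayed_lr(base_lr, epoch, lr_decay_epochs, gamma=0.1):
    """Return the learning rate at a given epoch: multiply base_lr by
    gamma once for every decay epoch that has already been reached."""
    n_decays = sum(1 for e in lr_decay_epochs if epoch >= e)
    return base_lr * gamma ** n_decays

# SR3D schedule from the comment above: decay at epochs 25 and 27.
lr_early = decayed_lr(1e-3, 10, [25, 27])   # before any decay: 1e-3
lr_mid = decayed_lr(1e-3, 26, [25, 27])     # after first decay: ~1e-4
lr_late = decayed_lr(1e-3, 28, [25, 27])    # after both decays: ~1e-5
```

During experimentation, passing very large milestones (e.g. `[10**9]`) disables automatic decay entirely, which matches the manual-tuning workflow described above.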
Thanks for your help and for such novel work.
Hi,
I have a question about the hyperparameter 'lr_decay_epochs'. Could you share how you decide on this hyperparameter?
Thanks for your help.