YyzHarry / imbalanced-semi-self

[NeurIPS 2020] Semi-Supervision (Unlabeled Data) & Self-Supervision Improve Class-Imbalanced / Long-Tailed Learning
https://arxiv.org/abs/2006.07529
MIT License
735 stars · 115 forks

What is the intended learning rate schedule? #16

Closed ChanghwaPark closed 3 years ago

ChanghwaPark commented 3 years ago

https://github.com/YyzHarry/imbalanced-semi-self/blob/16d8f02264d9e16602d1a47acc43053b6bb007c4/utils.py#L28-L39

Hi, thanks for sharing your code!

I have a question about the code referenced above. In the `adjust_learning_rate` function, lines 34 and 35 can never be reached. Could you share the learning rate schedule you actually used for the experiments in the paper?

As written, the `adjust_learning_rate` function changes the learning rate as follows:

epoch 0: args.lr * 1/5
epoch 1: args.lr * 2/5
epoch 2: args.lr * 3/5
epoch 3: args.lr * 4/5
epoch 4: args.lr * 5/5
epochs 5–160: args.lr
epochs 161+: args.lr * 0.01

YyzHarry commented 3 years ago

Hi - thanks for your interest! This was a typo introduced when I cleaned up the code (it has been updated). For the experiments on CIFAR/SVHN, the learning rate decays by 0.01 at both the 160th and the 180th epoch.
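For concreteness, here is a minimal sketch of the corrected schedule as described in the reply: linear warmup over the first 5 epochs, then a 0.01x decay at epoch 160 and another 0.01x decay at epoch 180. The function name mirrors the repo's `adjust_learning_rate`, but the body below is an illustration under those assumptions, not the repo's code, and the exact epoch boundaries may differ.

```python
def adjust_learning_rate(base_lr, epoch):
    """Return the learning rate for a given (0-indexed) epoch.

    Sketch only: in the actual repo this would also write the value
    into optimizer.param_groups instead of just returning it.
    """
    if epoch < 5:
        return base_lr * (epoch + 1) / 5   # warmup: 1/5, 2/5, ..., 5/5 of base_lr
    if epoch < 160:
        return base_lr                     # constant phase
    if epoch < 180:
        return base_lr * 0.01              # first 0.01x decay at epoch 160
    return base_lr * 0.01 * 0.01           # second 0.01x decay at epoch 180
```

With `base_lr = 0.1`, this gives 0.02 at epoch 0, 0.1 from epoch 4 through 159, 0.001 from epoch 160, and 1e-5 from epoch 180 onward.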