Closed amanchoudhri closed 1 year ago
Closes #26, closes #27.
Adds a subdict `LR_PARAMS` to the `OPTIMIZATION` section of config files, which expects a `SCHEDULER` param. Note a breaking change: the expected param name `MAX_LEARNING_RATE` is renamed to `INITIAL_LEARNING_RATE`.
Example sequential optimization subconfig:
"OPTIMIZATION": { "NUM_EPOCHS": 500, "OPTIMIZER": "SGD", "MOMENTUM": 0.9, "WEIGHT_DECAY": 1e-05, "CLIP_GRADIENTS": true, "INITIAL_LEARNING_RATE": 0.02, "SCHEDULERS": [ { "SCHEDULER_TYPE": "COSINE_ANNEALING", "MIN_LEARNING_RATE": 0.0, "NUM_EPOCHS_ACTIVE": 100, }, { "SCHEDULER_TYPE": "EXPONENTIAL_DECAY", "MULTIPLICATIVE_DECAY_FACTOR": 0.5, }, ] }
Example subconfig if only one scheduler is desired:
"OPTIMIZATION": { "NUM_EPOCHS": 500, "OPTIMIZER": "SGD", "MOMENTUM": 0.9, "WEIGHT_DECAY": 1e-05, "CLIP_GRADIENTS": true, "INITIAL_LEARNING_RATE": 0.02, "SCHEDULERS": [ { "SCHEDULER_TYPE": "REDUCE_ON_PLATEAU", "MULTIPLICATIVE_DECAY_FACTOR": 0.5, "PLATEAU_DECAY_PATIENCE": 25, }, ] }