dmccloskey opened 5 years ago
Description
Methods to allow for increasing or decreasing the learning rate based on user-defined criteria during training.
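As a rough sketch of the idea (all names here are hypothetical and not part of the existing ModelTrainer interface), a user-defined criterion could be any callable that inspects the training state at the end of an epoch and returns the learning rate to use next:

def user_defined_lr(lr, epoch, train_error, prev_train_error):
    # Decrease the learning rate when the training error stops improving.
    if prev_train_error is not None and train_error >= prev_train_error:
        return lr * 0.5
    # Periodically probe a slightly larger learning rate.
    if epoch > 0 and epoch % 10 == 0:
        return lr * 1.1
    # Otherwise leave the learning rate unchanged.
    return lr

The trainer would call such a function once per epoch and apply the returned value before the next epoch; the fixed decay schedules in the References below are special cases of this pattern.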
Objectives
ModelTrainer
Validation
References
Time-based decay
lr = lr0 / (1 + k * t)

The per-iteration update form (e.g., as applied inside Keras' SGD optimizer):
lr *= (1. / (1. + self.decay * self.iterations))
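For comparison with the step and exponential schedules below, the same decay can also be written as a closed-form function of the epoch; this is only a sketch with example values for initial_lrate and k:

def time_based_decay(epoch):
    initial_lrate = 0.1
    k = 0.1
    lrate = initial_lrate / (1. + k * epoch)
    return lrate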
Step decay
lr = lr0 * drop^floor(epoch / epochs_drop)

import math

def step_decay(epoch):
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10.0
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate
Exponential decay
lr = lr0 * e^(-k * t)

from math import exp

def exp_decay(epoch):
    initial_lrate = 0.1
    k = 0.1
    # Decay as a function of the epoch index.
    lrate = initial_lrate * exp(-k * epoch)
    return lrate
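These schedule functions take the epoch index and return a learning rate, which is the signature expected by Keras' LearningRateScheduler callback, so one way to exercise them is roughly the following (a toy model used only for illustration; step_decay is the function defined above):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from keras.callbacks import LearningRateScheduler

# Toy data and model, only to demonstrate attaching a schedule.
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=(100,))

model = Sequential([Dense(8, activation='relu', input_shape=(10,)),
                    Dense(1, activation='sigmoid')])
model.compile(optimizer=SGD(lr=0.1), loss='binary_crossentropy')

# LearningRateScheduler calls step_decay(epoch) at the start of each epoch
# and sets the optimizer's learning rate to the returned value.
model.fit(X, y, epochs=30, callbacks=[LearningRateScheduler(step_decay)])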