dmccloskey / EvoNet

Learning rate schedulers #93

Open dmccloskey opened 5 years ago

dmccloskey commented 5 years ago

Description

Methods to allow for increasing or decreasing the learning rate based on user-defined criteria during training.
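
Every scheduler referenced below reduces to a function from the epoch index to a learning rate. A minimal sketch of how such a user-defined function could plug into a training loop (fit and train_one_epoch are illustrative placeholders, not EvoNet's API):

def fit(train_one_epoch, schedule, n_epochs, initial_lr=0.1):
    # ask the user-defined schedule for the rate at the start of each epoch
    lr = initial_lr
    for epoch in range(n_epochs):
        lr = schedule(epoch)
        train_one_epoch(lr)
    return lr

# e.g. halve the rate every 10 epochs, with a no-op training step
fit(lambda lr: None, schedule=lambda epoch: 0.1 * 0.5 ** (epoch // 10), n_epochs=30)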

Objectives

Validation

References

Time-based decay

lr = lr0 / (1 + k*t)

def time_decay(epoch, initial_lrate=0.1, decay=0.01):
    # lr = lr0 / (1 + k*t), with k = decay and t = the epoch index;
    # per-iteration Keras form: lr *= 1. / (1. + self.decay * self.iterations)
    return initial_lrate * (1. / (1. + decay * epoch))

Step decay

lr = lr0 * drop^floor((1 + epoch) / epochs_drop)

import math

def step_decay(epoch):
    # drop the rate by a fixed factor every epochs_drop epochs
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10.0
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate

Exponential decay

lr = lr0 * e^(-k*t)

import math

def exp_decay(epoch):
    # lr(t) = lr0 * exp(-k*t), with t = the epoch index
    initial_lrate = 0.1
    k = 0.1
    lrate = initial_lrate * math.exp(-k * epoch)
    return lrate
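
Any of the functions above can be handed to Keras' LearningRateScheduler callback, which invokes the schedule at the start of each epoch. A runnable sketch assuming the tensorflow.keras API; the one-layer model and random data are placeholders, only there to make the snippet self-contained:

import numpy as np
from tensorflow import keras

# wire a schedule into training via the LearningRateScheduler callback
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=keras.optimizers.SGD(), loss="mse")
X = np.random.rand(32, 4)
y = np.random.rand(32, 1)
model.fit(X, y, epochs=20, verbose=0,
          callbacks=[keras.callbacks.LearningRateScheduler(step_decay)])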