Closed johnpaulbin closed 1 year ago
Adds learning rate decay to trainer/Tacotron2.py

New default hparams:

lr_decay_start=15000, # The global step at which to start decaying
lr_decay_rate=216000, # Rate at which to decay. Equation: learning_rate = self.learning_rate * (EulerNumber ** (-self.global_step / self.lr_decay_rate))
lr_decay_min=1e-5 # Minimum learning rate
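The schedule above can be sketched as a standalone function. The base learning rate and the pre-decay behavior (hold the base rate until lr_decay_start, then apply the exponential formula to the raw global step and clamp at the minimum) are assumptions for illustration, not confirmed by the PR:

```python
import math

# Hypothetical constants mirroring the hparams described in this PR.
LR_DECAY_START = 15000   # global step at which decay begins
LR_DECAY_RATE = 216000   # time constant of the exponential decay
LR_DECAY_MIN = 1e-5      # floor on the learning rate
BASE_LR = 1e-3           # assumed initial learning rate (not stated in the PR)

def decayed_lr(global_step: int) -> float:
    """Exponential decay: BASE_LR * e**(-global_step / LR_DECAY_RATE),
    held at BASE_LR before LR_DECAY_START and clamped at LR_DECAY_MIN."""
    if global_step < LR_DECAY_START:
        return BASE_LR
    lr = BASE_LR * math.exp(-global_step / LR_DECAY_RATE)
    return max(lr, LR_DECAY_MIN)
```

With these assumed values the rate falls to BASE_LR / e after 216000 steps and bottoms out at LR_DECAY_MIN for very large step counts.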