Closed: HydrogenSulfate closed this issue 3 years ago
In `lr = cfg.lr * (cfg.gamma ** -(epoch // cfg.n_steps))`, `cfg.gamma` is 0.1, so the negative exponent makes the learning rate grow during training instead of decaying. Is that intended?
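For reference, a minimal sketch of the issue (config values here are assumed for illustration, not taken from the repo): with `gamma = 0.1`, negating the exponent turns step decay into exponential growth, since `0.1 ** -k == 10 ** k`. Dropping the minus sign restores the usual step-decay schedule.

```python
import math

base_lr, gamma, n_steps = 0.01, 0.1, 30  # hypothetical config values

def buggy_lr(epoch):
    # Negative exponent: 0.1 ** -k grows as 10 ** k, so lr increases
    # by 10x every n_steps epochs.
    return base_lr * (gamma ** -(epoch // n_steps))

def fixed_lr(epoch):
    # Standard step decay: lr shrinks by a factor of gamma every
    # n_steps epochs.
    return base_lr * (gamma ** (epoch // n_steps))

print(buggy_lr(60))  # ~1.0 (grew by 100x)
print(fixed_lr(60))  # ~0.0001 (decayed by 100x)
```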
Hi @HydrogenSulfate, this bug has been fixed.