NeuroDiffGym / neurodiffeq

A library for solving differential equations using neural networks based on PyTorch, used by multiple research groups around the world, including at Harvard IACS.
http://pypi.org/project/neurodiffeq/
MIT License

Adding the option to use a learning rate schedule during training. #223

Open dreivmeister opened 1 month ago

dreivmeister commented 1 month ago

I was thinking that adding the option to use a PyTorch learning rate scheduler could improve results, and it shouldn't be hard to implement.

I guess one could set it up like this:

import torch
from torch.optim.lr_scheduler import ExponentialLR

parameters = [p for net in nets for p in net.parameters()]  # parameters of all networks
MY_LEARNING_RATE = 5e-3
optimizer = torch.optim.Adam(parameters, lr=MY_LEARNING_RATE, ...)
scheduler = ExponentialLR(optimizer, gamma=0.9)
solver = Solver1D(..., nets=nets, optimizer=optimizer, lr_scheduler=scheduler)
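
For reference, ExponentialLR multiplies every parameter group's learning rate by gamma on each step() call, so after n steps the rate is MY_LEARNING_RATE * gamma**n. A quick standalone check (the single dummy parameter is just a placeholder for illustration):

import torch
from torch.optim.lr_scheduler import ExponentialLR

param = torch.nn.Parameter(torch.zeros(1))  # dummy parameter for illustration
optimizer = torch.optim.Adam([param], lr=5e-3)
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    optimizer.step()    # a no-op here (no gradients), but must precede scheduler.step()
    scheduler.step()
    print(scheduler.get_last_lr())  # ~[0.0045], then ~[0.00405], then ~[0.003645]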

Then a scheduler step could be done after each train and valid epoch, e.g. inside the solver's fit loop:

for local_epoch in loop:
    self.run_train_epoch()
    self.run_valid_epoch()
    if self.lr_scheduler:
        self.lr_scheduler.step()
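
In the meantime, I think something close to this is already possible through the callbacks argument of fit(), assuming each callback is a plain callable that receives the solver instance once per epoch (step_scheduler below is a hypothetical helper, not existing library API):

import torch
from torch.optim.lr_scheduler import ExponentialLR

scheduler = ExponentialLR(optimizer, gamma=0.9)  # reuses the optimizer from above

def step_scheduler(solver):
    # Hypothetical per-epoch callback: decay the learning rate once per epoch.
    scheduler.step()

solver.fit(max_epochs=1000, callbacks=[step_scheduler])

Still, a built-in lr_scheduler argument would be cleaner, since the solver would own the scheduler state the same way it owns the optimizer.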

Is this needed?