A PyTorch-based library for solving differential equations using neural networks, used by multiple research groups around the world, including at Harvard IACS.
I was thinking that adding the option to use a PyTorch learning rate scheduler could improve results and wouldn't be hard to implement.
I imagine it could be done like this:
import torch
from torch.optim.lr_scheduler import ExponentialLR

# collect the parameters of all networks
parameters = [p for net in nets for p in net.parameters()]
MY_LEARNING_RATE = 5e-3
optimizer = torch.optim.Adam(parameters, lr=MY_LEARNING_RATE, ...)
scheduler = ExponentialLR(optimizer, gamma=0.9)
solver = Solver1D(..., nets=nets, optimizer=optimizer, lr_scheduler=scheduler)
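Just to illustrate the effect, here is a minimal self-contained sketch, independent of the solver (the dummy net only exists so the optimizer has parameters), showing how ExponentialLR with gamma=0.9 multiplies the learning rate by 0.9 on every step() call:

import torch
from torch.optim.lr_scheduler import ExponentialLR

net = torch.nn.Linear(1, 1)  # dummy network, just to have parameters
optimizer = torch.optim.Adam(net.parameters(), lr=5e-3)
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    optimizer.step()   # a real training step would go here
    scheduler.step()   # decay the learning rate by gamma
    print(optimizer.param_groups[0]['lr'])  # ~4.5e-3, ~4.05e-3, ~3.645e-3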
And then step the scheduler after each training and validation epoch, like this:
for local_epoch in loop:
    self.run_train_epoch()
    self.run_valid_epoch()
    # advance the learning rate schedule once per epoch, if a scheduler was given
    if self.lr_scheduler:
        self.lr_scheduler.step()
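One caveat worth noting (this is my assumption about how the feature might handle it, not anything the library does today): unlike the other built-in schedulers, ReduceLROnPlateau's step() takes the monitored metric as an argument, so the loop would need a small dispatch. A hypothetical helper could look like the sketch below, where valid_loss is a made-up name for whatever validation metric the solver tracks:

from torch.optim.lr_scheduler import ReduceLROnPlateau

def step_scheduler(scheduler, valid_loss):
    # ReduceLROnPlateau.step() requires the monitored metric;
    # every other built-in scheduler's step() takes no arguments
    if isinstance(scheduler, ReduceLROnPlateau):
        scheduler.step(valid_loss)
    else:
        scheduler.step()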
Is a feature like this needed?