Hi! Thanks for your contribution! Great first issue!
Hi! In your LightningModule you could do this:

    def on_epoch_start(self):
        if self.loss_SUM < 0.3:
            self.trainer.optimizers[0] = Adam(...)

and start with LBFGS as the default optimizer returned from configure_optimizers.
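To make this concrete, here is a minimal sketch of that first approach. It is only illustrative: it assumes a Lightning release from roughly the 1.x era where trainer.optimizers is a plain list that can be reassigned, a hypothetical toy linear model, and that loss_SUM is a running loss the module accumulates itself in training_step; none of those details come from the original post.

    import torch
    import pytorch_lightning as pl
    from torch.optim import Adam, LBFGS

    class SwitchableModule(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 1)   # toy model, just for illustration
            self.loss_SUM = float("inf")          # running epoch loss, filled in training_step
            self.switched = False                 # make sure we only swap once

        def configure_optimizers(self):
            # Start training with LBFGS as the default optimizer.
            return LBFGS(self.parameters(), lr=1.0)

        def on_epoch_start(self):
            # Look at the loss accumulated over the previous epoch; once it
            # drops below the threshold, replace LBFGS with Adam in place.
            if not self.switched and self.loss_SUM < 0.3:
                self.trainer.optimizers[0] = Adam(self.parameters(), lr=1e-3)
                self.switched = True
            self.loss_SUM = 0.0  # reset the accumulator for the new epoch

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = torch.nn.functional.mse_loss(self.layer(x), y)
            self.loss_SUM += loss.item()  # accumulate for the epoch-level check
            return loss

Note that this only swaps the object sitting in trainer.optimizers[0]; any learning-rate scheduler attached to the old optimizer would have to be rebuilt separately.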
I think this logic can now be done better in configure_optimizers itself, in case someone also has more involved schedulers or a scheduler dict, and then calling:
    def on_epoch_start(self):
        if condition:
            self.trainer.accelerator_backend.setup_optimizers(self)

    def configure_optimizers(self):
        if condition:
            return Adam(...)
        else:
            return LBFGS(...)
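Below is a hedged sketch of how that second suggestion could look end to end. It assumes the same kind of toy module as above and a Lightning release (around 1.0/1.1) that still exposes trainer.accelerator_backend.setup_optimizers; later versions moved this API, so the exact call is tied to that era rather than being a current recipe.

    import torch
    import pytorch_lightning as pl
    from torch.optim import Adam, LBFGS

    class SwitchInConfigureOptimizers(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 1)   # toy model, just for illustration
            self.loss_SUM = float("inf")          # running loss used as the switch condition
            self.switched = False                 # avoid rebuilding the optimizer every epoch

        def configure_optimizers(self):
            # This hook is consulted again when setup_optimizers is re-run,
            # so the condition decides which optimizer (and scheduler, if any)
            # gets built at that point.
            if self.loss_SUM < 0.3:
                return Adam(self.parameters(), lr=1e-3)
            return LBFGS(self.parameters(), lr=1.0)

        def on_epoch_start(self):
            # Re-running optimizer setup makes the trainer call
            # configure_optimizers again and pick up the new optimizer.
            if not self.switched and self.loss_SUM < 0.3:
                self.trainer.accelerator_backend.setup_optimizers(self)
                self.switched = True

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = torch.nn.functional.mse_loss(self.layer(x), y)
            self.loss_SUM = loss.item()  # simplistic tracking, just for the example
            return loss

Guarding the call with a flag matters here: without it, setup_optimizers would rebuild the Adam optimizer (and wipe its state) at the start of every remaining epoch.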
Moved this to the forum https://forums.pytorchlightning.ai/t/how-to-switch-from-optimizer-during-training/219
Thanks @rohitgr7
Is it possible to show how we should write the "configure_optimizers" and "training_step" functions for the following code? The purpose of the code is to switch the optimizer from LBFGS to Adam when loss_SUM < 0.3.