Have a look here: https://pytorch-lightning.readthedocs.io/en/stable/optimizers.html and here: https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.core.lightning.html#pytorch_lightning.core.lightning.LightningModule.configure_optimizers
You can return one dictionary per optimizer, like this:
# example with optimizer frequencies
# see training procedure in `Improved Training of Wasserstein GANs`, Algorithm 1
# https://arxiv.org/abs/1704.00028
from torch.optim import Adam

def configure_optimizers(self):
    gen_opt = Adam(self.model_gen.parameters(), lr=0.01)
    dis_opt = Adam(self.model_disc.parameters(), lr=0.02)
    n_critic = 5
    return (
        {'optimizer': dis_opt, 'frequency': n_critic},
        {'optimizer': gen_opt, 'frequency': 1},
    )
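With these frequencies the Trainer alternates batch-wise: dis_opt is stepped on n_critic consecutive batches, then gen_opt on the next batch, and the cycle repeats. Inside training_step, the optimizer_idx argument tells you which optimizer is currently active.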
Hope this helps
If this doesn't fully answer your question, please let me know and I will reopen.
Hi Adrian, thanks for your reply. This definitely answers my question. I really appreciate it.
I have two other questions that I'd appreciate your help with as well:
We are developing a PyTorch Lightning based script to solve a PDE, and at this point we don't know how to proceed because we don't have a good understanding of how to use the LBFGS optimizer.
Regards, Peyman
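On the LBFGS question: with Lightning's automatic optimization you can usually just return the optimizer from configure_optimizers, since the Trainer itself passes training_step as the closure that LBFGS.step() re-evaluates. A minimal sketch (the lr and max_iter values here are placeholders, not tuned recommendations):

import torch

def configure_optimizers(self):
    # Lightning supplies the closure to optimizer.step() internally,
    # so no manual closure is needed under automatic optimization
    return torch.optim.LBFGS(self.parameters(), lr=1.0, max_iter=20)

Note that LBFGS re-evaluates the loss several times per step, so it does not mix well with the multi-optimizer setups discussed above.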
Hi, I tried to combine this with an LR scheduler passed alongside the optimizer, and I get this error saying the generator loss is missing:
raise MisconfigurationException(
pytorch_lightning.utilities.exceptions.MisconfigurationException: ReduceLROnPlateau conditioned on metric 0_Generator_Loss which is not available. Available metrics are: ['4_Discriminator_fake_Loss', '5_Discriminator_real_Loss', '6_Aux_loss', '7_accruracy', '1_Discriminator_Loss']. Condition can be set using monitor key in lr scheduler dict
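The fix the error message suggests is to name the logged metric under the monitor key of the lr_scheduler dict, and to make sure that metric is actually logged (e.g. with self.log in training_step) so it exists when the scheduler steps. A sketch reusing the names from the earlier reply; the metric name '0_Generator_Loss' is taken from the error above:

from torch.optim import Adam
from torch.optim.lr_scheduler import ReduceLROnPlateau

def configure_optimizers(self):
    gen_opt = Adam(self.model_gen.parameters(), lr=0.01)
    dis_opt = Adam(self.model_disc.parameters(), lr=0.02)
    gen_sched = ReduceLROnPlateau(gen_opt)
    return (
        {'optimizer': dis_opt, 'frequency': 5},
        {'optimizer': gen_opt, 'frequency': 1,
         # 'monitor' must name a metric that has been logged
         'lr_scheduler': {'scheduler': gen_sched, 'monitor': '0_Generator_Loss'}},
    )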
Is there a more efficient way to train the discriminator 5 times and the generator only once?
def configure_optimizers(self):
    genOptim = optim.Adam(self.Generator.parameters(), lr=self.hparams.lr_gen)
    disOptim = optim.Adam(self.Discriminator.parameters(), lr=self.hparams.lr_dis)
    # 5 copies of the discriminator optimizer -> 5 discriminator updates per generator update
    return [disOptim, disOptim, disOptim, disOptim, disOptim, genOptim], []

def training_step(self, train_batch, batch_idx, optimizer_idx):
    if optimizer_idx < 5:  # disOptim
        ...
    if optimizer_idx == 5:  # genOptim
        ...
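The frequency mechanism from the reply above avoids listing the same optimizer five times. A sketch with the same names, assuming the rest of the module is unchanged:

def configure_optimizers(self):
    genOptim = optim.Adam(self.Generator.parameters(), lr=self.hparams.lr_gen)
    disOptim = optim.Adam(self.Discriminator.parameters(), lr=self.hparams.lr_dis)
    return (
        {'optimizer': disOptim, 'frequency': 5},
        {'optimizer': genOptim, 'frequency': 1},
    )

def training_step(self, train_batch, batch_idx, optimizer_idx):
    if optimizer_idx == 0:    # disOptim, stepped on 5 consecutive batches
        ...
    elif optimizer_idx == 1:  # genOptim, stepped on the following batch
        ...

Note the semantics differ slightly: with frequencies, the 5 discriminator updates happen on 5 consecutive batches rather than 5 times on the same batch, which matches Algorithm 1 of the WGAN-GP paper.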