Lightning-AI / pytorch-lightning

Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

Training the discriminator five times for WGAN and the generator only once #3665

Closed ghost closed 3 years ago

ghost commented 3 years ago

Is there a more efficient method to train the discriminator 5 times and the generator only once?

def configure_optimizers(self):
    genOptim = optim.Adam(self.Generator.parameters(), lr=self.hparams.lr_gen)
    disOptim = optim.Adam(self.Discriminator.parameters(), lr=self.hparams.lr_dis)
    # return the discriminator optimizer 5 times so it is stepped 5x per generator step
    return [disOptim, disOptim, disOptim, disOptim, disOptim, genOptim], []

# 5 times training of discriminator
def training_step(self, train_batch, batch_idx, optimizer_idx):
    if optimizer_idx < 5:  # disOptim
        ...
        return {'loss': loss_dis, 'loss_dis': loss_dis}
    if optimizer_idx == 5:  # genOptim
        ...
        return {'loss': loss_gen, 'loss_gen': loss_gen}
awaelchli commented 3 years ago

Have a look here: https://pytorch-lightning.readthedocs.io/en/stable/optimizers.html and here: https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.core.lightning.html#pytorch_lightning.core.lightning.LightningModule.configure_optimizers

You can return a dictionary like this:

# example with optimizer frequencies
# see training procedure in `Improved Training of Wasserstein GANs`, Algorithm 1
# https://arxiv.org/abs/1704.00028
def configure_optimizers(self):
    gen_opt = Adam(self.model_gen.parameters(), lr=0.01)
    dis_opt = Adam(self.model_disc.parameters(), lr=0.02)
    n_critic = 5
    return (
        {'optimizer': dis_opt, 'frequency': n_critic},
        {'optimizer': gen_opt, 'frequency': 1}
    )

Hope this helps
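
For completeness, here is a minimal sketch (not from the docs) of a `training_step` that matches the frequency-based setup above. It assumes `torch` is imported, the `model_gen` / `model_disc` names from the snippet, a `latent_dim` hyperparameter, and uses plain WGAN losses without gradient penalty as placeholders:

def training_step(self, batch, batch_idx, optimizer_idx):
    real, _ = batch
    # hypothetical latent_dim hyperparameter; adapt to your module
    z = torch.randn(real.size(0), self.hparams.latent_dim, device=self.device)
    if optimizer_idx == 0:  # critic/discriminator step, runs on 5 consecutive batches
        fake = self.model_gen(z).detach()
        # WGAN critic loss: minimize D(fake) - D(real)
        loss_dis = self.model_disc(fake).mean() - self.model_disc(real).mean()
        return {'loss': loss_dis, 'loss_dis': loss_dis}
    if optimizer_idx == 1:  # generator step, runs on the next batch
        fake = self.model_gen(z)
        # generator loss: minimize -D(fake)
        loss_gen = -self.model_disc(fake).mean()
        return {'loss': loss_gen, 'loss_gen': loss_gen}

With the frequency setting, Lightning switches optimizer_idx between 0 and 1 automatically, so no manual step counter is needed.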

awaelchli commented 3 years ago

If this doesn't fully answer your question, please let me know and I will reopen.

ghost commented 3 years ago

Hi Adrian, thanks for your reply. This definitely answers my question. I really appreciate it.

I have two other questions that I'd appreciate your help with as well:

  1. How to use LBFGS in PyTorch Lightning #3672, https://github.com/PyTorchLightning/pytorch-lightning/issues/3672
  2. Switch from LBFGS to Adam optimizer during the training loop #3664, https://github.com/PyTorchLightning/pytorch-lightning/issues/3664

We are developing a PyTorch Lightning based script to solve PDEs, and at this point, without a good understanding of how to use the LBFGS optimizer, we do not know how to proceed.

Regards, Peyman


msverma101 commented 1 year ago

> Have a look here: https://pytorch-lightning.readthedocs.io/en/stable/optimizers.html and here: https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.core.lightning.html#pytorch_lightning.core.lightning.LightningModule.configure_optimizers
>
> You can return a dictionary like this: [the frequency-based `configure_optimizers` example quoted from the earlier reply]
>
> Hope this helps

Hi, I tried to combine this with an LR scheduler and pass it to the optimizer, and I get this error saying the generator loss metric is missing:

    raise MisconfigurationException(
pytorch_lightning.utilities.exceptions.MisconfigurationException: ReduceLROnPlateau conditioned on metric 0_Generator_Loss which is not available. Available metrics are: ['4_Discriminator_fake_Loss', '5_Discriminator_real_Loss', '6_Aux_loss', '7_accruracy', '1_Discriminator_Loss']. Condition can be set using monitor key in lr scheduler dict
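
For anyone hitting the same error: below is a minimal sketch (my own, not from the thread) of setting the monitor key the message refers to. It assumes the generator loss is logged under the exact name '0_Generator_Loss' via self.log(...) in training_step; the learning rates and n_critic=5 are carried over from the example above:

from torch.optim import Adam
from torch.optim.lr_scheduler import ReduceLROnPlateau

def configure_optimizers(self):
    gen_opt = Adam(self.model_gen.parameters(), lr=0.01)
    dis_opt = Adam(self.model_disc.parameters(), lr=0.02)
    gen_sched = ReduceLROnPlateau(gen_opt, mode='min')
    return (
        {'optimizer': dis_opt, 'frequency': 5},
        {
            'optimizer': gen_opt,
            'frequency': 1,
            'lr_scheduler': {
                'scheduler': gen_sched,
                # must match a metric name logged with self.log(...)
                'monitor': '0_Generator_Loss',
            },
        },
    )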