AntixK / PyTorch-VAE

A Collection of Variational Autoencoders (VAE) in PyTorch.
Apache License 2.0

value error #84

Closed: ShengJinhao closed this issue 8 months ago

ShengJinhao commented 8 months ago

ValueError: The provided lr scheduler "<torch.optim.lr_scheduler.ExponentialLR object at 0x7f5cdc008280>" is invalid. I would like to know why this error occurs. I am using Python 3.8. Thanks.
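For context, this error is raised while Lightning validates what configure_optimizers returns. Below is a hypothetical minimal sketch (not the exact repository code) of the kind of return value that can trigger it, with the scheduler returned as a bare object rather than a configuration dict; self.model and self.params are assumed from experiment.py:

# Hypothetical sketch; optim refers to torch.optim, as imported in experiment.py.
def configure_optimizers(self):
    optimizer = optim.Adam(self.model.parameters(), lr=self.params['LR'])
    scheduler = optim.lr_scheduler.ExponentialLR(optimizer,
                                                 gamma=self.params['scheduler_gamma'])
    # Returning the scheduler instance directly is what some
    # PyTorch / Lightning version combinations reject with
    # "The provided lr scheduler ... is invalid".
    return [optimizer], [scheduler]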

JiangMingchen commented 8 months ago

Changing configure_optimizers in experiment.py as follows works for me.

def configure_optimizers(self):
    # Note: experiment.py already imports torch.optim as optim.
    optims = []
    scheds = []

    optimizer = optim.Adam(self.model.parameters(),
                           lr=self.params['LR'],
                           weight_decay=self.params['weight_decay'])
    optims.append(optimizer)

    # Check if more than one optimizer is required (used for adversarial training)
    try:
        if self.params['LR_2'] is not None:
            optimizer2 = optim.Adam(getattr(self.model, self.params['submodel']).parameters(),
                                    lr=self.params['LR_2'])
            optims.append(optimizer2)
    except KeyError:
        pass

    try:
        if self.params['scheduler_gamma'] is not None:
            scheduler = optim.lr_scheduler.ExponentialLR(optims[0],
                                                         gamma=self.params['scheduler_gamma'])
            # Wrap the scheduler in the dict format that Lightning expects
            scheduler_config = {'scheduler': scheduler, 'interval': 'epoch'}
            scheds.append(scheduler_config)

            # Check if another scheduler is required for the second optimizer
            try:
                if self.params['scheduler_gamma_2'] is not None:
                    scheduler2 = optim.lr_scheduler.ExponentialLR(optims[1],
                                                                  gamma=self.params['scheduler_gamma_2'])
                    scheduler_config2 = {'scheduler': scheduler2, 'interval': 'epoch'}
                    scheds.append(scheduler_config2)
            except KeyError:
                pass

            return optims, scheds
    except KeyError:
        return optims

It seems the scheduler needs to be returned in a different format: wrapped in a configuration dict rather than passed as a bare ExponentialLR object.

My reference is the official PyTorch Lightning documentation: https://lightning.ai/docs/pytorch/1.5.6/api/pytorch_lightning.core.lightning.html
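For reference, here is a minimal sketch of the dictionary-based format described in those docs; the 'scheduler', 'interval', and 'frequency' keys follow the Lightning documentation, while the optimizer, model, and parameter names below are illustrative:

# Sketch based on the Lightning docs linked above; optim refers to torch.optim.
def configure_optimizers(self):
    optimizer = optim.Adam(self.model.parameters(), lr=self.params['LR'])
    scheduler = optim.lr_scheduler.ExponentialLR(optimizer,
                                                 gamma=self.params['scheduler_gamma'])
    lr_scheduler_config = {
        'scheduler': scheduler,  # the LR scheduler instance
        'interval': 'epoch',     # step the scheduler once per epoch
        'frequency': 1,          # at every epoch boundary
    }
    return {'optimizer': optimizer, 'lr_scheduler': lr_scheduler_config}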

ShengJinhao commented 8 months ago

Thanks.
