open-mmlab / mmgeneration

MMGeneration is a powerful toolkit for generative models, based on PyTorch and MMCV.
https://mmgeneration.readthedocs.io/en/latest/
Apache License 2.0

Separate lr_configs for generator optimizer and discriminator optimizer #280

Open eehitray opened 2 years ago

eehitray commented 2 years ago

Is there a way to specify a separate lr_config for the generator optimizer, and a separate lr_config for the discriminator optimizer? From my understanding, the lr_config specified in the config file applies to both the generator and the discriminator optimizer.

plyfager commented 2 years ago

Sure, you can set a separate learning rate in the optimizer. Example: https://github.com/open-mmlab/mmgeneration/blob/master/configs/_base_/models/biggan/biggan_128x128.py#L30-L32
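Following the pattern in the linked BigGAN config, the per-module optimizer settings look roughly like the sketch below. The optimizer types, learning rates, and betas here are illustrative values, not taken from this thread:

```python
# Separate optimizers (and thus separate learning rates) for the
# generator and the discriminator, keyed by submodule name.
optimizer = dict(
    generator=dict(type='Adam', lr=0.0001, betas=(0.0, 0.999)),
    discriminator=dict(type='Adam', lr=0.0004, betas=(0.0, 0.999)))
```

The runner looks up each key by submodule name, so the generator and discriminator can use entirely different optimizer settings.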

eehitray commented 2 years ago

I understand that; however, I wish to set up a separate lr_config; for example, using CosineAnnealing for the generator but no annealing for the discriminator. Is this possible?

plyfager commented 2 years ago

Got your point; this feature is not currently supported. We are working on a refactor that covers it, which will probably be done by the end of June.

plyfager commented 2 years ago

In 1.x, you can set `param_scheduler` like this for your purpose.

param_scheduler = dict(
    generator=dict(type='LinearLrInterval', xxx),
    discriminator=dict(type=xxx))