Shiweiliuiiiiiii opened 2 years ago
Dear authors,
Do I have to use `LearningRateDecayOptimizerConstructor` to reproduce the ConvNeXt segmentation results?
Or can I use the default optimizer constructor with the same decay parameters, like this?

```python
optimizer = dict(
    _delete_=True,
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    paramwise_cfg={'decay_rate': 0.9,
                   'decay_type': 'stage_wise',
                   'num_layers': 12})
```
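For comparison, the upstream ConvNeXt segmentation configs select the custom constructor explicitly through a `constructor` key in the optimizer dict; as far as I can tell, the default constructor does not interpret `decay_rate`/`decay_type`. A minimal sketch in the mmcv/mmsegmentation config style (assuming `LearningRateDecayOptimizerConstructor` is registered, as it is in the ConvNeXt segmentation codebase):

```python
# Sketch: opt into the custom layer-wise/stage-wise decay constructor.
# Without the `constructor` key, DefaultOptimizerConstructor is used,
# and the decay_rate/decay_type entries in paramwise_cfg have no effect.
optimizer = dict(
    constructor='LearningRateDecayOptimizerConstructor',
    _delete_=True,
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    paramwise_cfg={'decay_rate': 0.9,
                   'decay_type': 'stage_wise',
                   'num_layers': 12})
```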
Many thanks, Shiwei
Any updates?