WXinlong / DenseCL

Dense Contrastive Learning (DenseCL) for self-supervised representation learning, CVPR 2021 Oral.
GNU General Public License v3.0

Details about loss_lambda warmup #16

Closed alohays closed 3 years ago

alohays commented 3 years ago

Thank you for your great work. Could you share the implementation details or code for the loss_lambda warmup setting described in the DenseCL paper?

WXinlong commented 3 years ago

Hi, it is implemented as a hook. I have uploaded it to this repo. Please refer to `hooks/densecl_warmup_hook.py` for details.

For usage, you just need to add this to your config file: `custom_hooks = [dict(type='DenseCLWarmupHook', start_iters=10000)]`