Tony-Y / pytorch_warmup

Learning Rate Warmup in PyTorch
https://tony-y.github.io/pytorch_warmup/
MIT License

How to use in `pytorch-lightning`? #8

Closed. sieu-n closed this issue 1 year ago.

sieu-n commented 2 years ago

Thank you for a great implementation.

What do you think is the most appropriate way to use this library inside pytorch-lightning?

Tony-Y commented 2 years ago

I don't know how to use pytorch_warmup inside pytorch-lightning myself, but you could work it out from the pytorch-lightning docs: Bring your own Custom Learning Rate Schedulers.

import torch
import pytorch_warmup as warmup

def configure_optimizers(self):
    optimizer = torch.optim.AdamW(self.model.parameters(), lr=0.001, betas=(0.9, 0.999), weight_decay=0.01)
    num_steps = len(self.train_dataloader()) * self.num_epochs  # I don't know whether this line is correct or not.
    lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=num_steps)
    warmup_scheduler = warmup.UntunedLinearWarmup(optimizer)
    # Return both schedulers as a single object; lr_scheduler_step below unpacks them.
    return [optimizer], [{"scheduler": (lr_scheduler, warmup_scheduler), "interval": "step"}]

def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
    lr_scheduler, warmup_scheduler = scheduler
    with warmup_scheduler.dampening():
        lr_scheduler.step()
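For reference, this is roughly the plain-PyTorch training-loop pattern that the hooks above adapt, following the usage described in the pytorch_warmup README (the model, data, and hyperparameter values below are only placeholders):

import torch
import pytorch_warmup as warmup

# Placeholder model and data; any model and dataloader work the same way.
model = torch.nn.Linear(10, 2)
dataset = torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)
num_epochs = 5

optimizer = torch.optim.AdamW(model.parameters(), lr=0.001, betas=(0.9, 0.999), weight_decay=0.01)
num_steps = len(dataloader) * num_epochs  # one scheduler step per optimizer step
lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=num_steps)
warmup_scheduler = warmup.UntunedLinearWarmup(optimizer)

for epoch in range(num_epochs):
    for x, y in dataloader:
        optimizer.zero_grad()
        loss = torch.nn.functional.cross_entropy(model(x), y)
        loss.backward()
        optimizer.step()
        # dampening() scales the learning rate set by the cosine schedule while
        # warmup is active; after warmup the cosine schedule runs unchanged.
        with warmup_scheduler.dampening():
            lr_scheduler.step()

The key point is that `lr_scheduler.step()` is wrapped in `warmup_scheduler.dampening()` on every optimizer step, which is what `"interval": "step"` together with the custom `lr_scheduler_step` reproduces in Lightning.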

Tony-Y commented 1 year ago

I have just done it on Colab: PyTorch Lightning with pytorch_warmup.ipynb
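
For readers without access to the notebook, a minimal end-to-end sketch along the same lines might look like the following. This is an untested outline rather than the notebook's contents: the class name, toy model, and data are placeholders; `self.trainer.estimated_stepping_batches` is an alternative way to obtain the total step count in recent pytorch-lightning versions; and the `lr_scheduler_step` signature follows the hook as written above (newer Lightning versions drop the optimizer_idx argument).

import torch
import pytorch_lightning as pl
import pytorch_warmup as warmup

class LitModel(pl.LightningModule):  # placeholder name
    def __init__(self):
        super().__init__()
        self.model = torch.nn.Linear(10, 2)  # placeholder model

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.model(x), y)

    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.model.parameters(), lr=0.001,
                                      betas=(0.9, 0.999), weight_decay=0.01)
        # Total number of optimizer steps the Trainer will run.
        num_steps = self.trainer.estimated_stepping_batches
        lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=num_steps)
        warmup_scheduler = warmup.UntunedLinearWarmup(optimizer)
        return [optimizer], [{"scheduler": (lr_scheduler, warmup_scheduler), "interval": "step"}]

    def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
        # Overriding this hook lets configure_optimizers return a non-standard
        # "scheduler" object (here, a tuple of the two schedulers).
        lr_scheduler, warmup_scheduler = scheduler
        with warmup_scheduler.dampening():
            lr_scheduler.step()

dataset = torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
loader = torch.utils.data.DataLoader(dataset, batch_size=8)
trainer = pl.Trainer(max_epochs=5)
trainer.fit(LitModel(), loader)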