Tony-Y / pytorch_warmup

Learning Rate Warmup in PyTorch
https://tony-y.github.io/pytorch_warmup/
MIT License

Can the warmup_scheduler update the learning rate every epoch and not every batch? #17

Closed · talrub closed this issue 8 months ago

talrub commented 8 months ago

Hi, if I want the warmup_scheduler to update the learning rate after every epoch and not after every batch, should I just do the following (using dampening() after every epoch)?

for epoch in range(1, num_epochs + 1):
  for idx, batch in enumerate(dataloader):
    optimizer.zero_grad()
    loss = ...
    loss.backward()
    optimizer.step()
    with warmup_scheduler.dampening():
      lr_scheduler.step(epoch + idx / iters)

Thanks!
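For context, the loop above assumes that an optimizer, a batch-wise lr_scheduler, and a warmup_scheduler have already been created. A minimal setup sketch follows; the toy model and data, AdamW, CosineAnnealingWarmRestarts with T_0=10, and warmup_period=2000 are illustrative assumptions, not values from this thread.

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_warmup as warmup

# Toy stand-ins for the model and data used in the loop above (assumptions).
model = torch.nn.Linear(8, 1)
dataloader = DataLoader(TensorDataset(torch.randn(640, 8), torch.randn(640, 1)),
                        batch_size=32)
iters = len(dataloader)  # batches per epoch, as used in lr_scheduler.step(epoch + idx / iters)
num_epochs = 10

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
# A schedule that accepts a fractional epoch, matching step(epoch + idx / iters).
lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, T_0=10)
# The warmup period counts dampening() calls; 2000 is an assumed value.
warmup_scheduler = warmup.LinearWarmup(optimizer, warmup_period=2000)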
Tony-Y commented 8 months ago

I think you can do it using dampen() directly:

        optimizer.step()
        lr_scheduler.step(epoch + idx / iters)
        warmup_scheduler.dampen(epoch)

I don't have time to check the code above, but it should work.
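As I read the library (an aside, not from the thread), dampening() is a context manager that advances the warmup by one step each time the wrapped lr_scheduler.step() finishes, while dampen(step) simply rescales each parameter group's current learning rate by the warmup factor for the given step. That is why dampen(epoch) goes after lr_scheduler.step(...), and why it keeps the warmup factor constant across all batches of an epoch. A small standalone sketch of that reading, using a dummy optimizer and an assumed warmup_period of 5:

import torch
import pytorch_warmup as warmup

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.SGD(params, lr=1.0)
warmup_scheduler = warmup.LinearWarmup(optimizer, warmup_period=5)

for epoch in range(8):
    # Pretend an lr scheduler has just reset the lr to its scheduled value.
    optimizer.param_groups[0]['lr'] = 1.0
    # dampen(epoch) rescales the current lr by the warmup factor for that epoch.
    warmup_scheduler.dampen(epoch)
    print(epoch, optimizer.param_groups[0]['lr'])

Under this reading, the printed learning rate ramps up over the first warmup_period epochs and then stays at 1.0.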

Tony-Y commented 8 months ago

I have just checked the code:

for epoch in range(epochs):
  for idx in range(steps_per_epoch):
    ...
    optimizer.step()
    lr_scheduler.step(epoch + idx / steps_per_epoch)
    warmup_scheduler.dampen(epoch)

https://gist.github.com/Tony-Y/ab0b1fb1af6dbb06b9c932214d954eea
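The gist itself is not reproduced here; a self-contained sketch along the same lines might look like the code below, where the toy model and data, AdamW, T_0=5, warmup_period=3, and the base learning rate are illustrative assumptions rather than values from the gist.

import torch
import pytorch_warmup as warmup

epochs = 10
steps_per_epoch = 20

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
# Batch-wise cosine schedule stepped with a fractional epoch.
lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, T_0=5)
# Warmup measured in epochs here, since dampen(epoch) is passed the epoch index.
warmup_scheduler = warmup.LinearWarmup(optimizer, warmup_period=3)

for epoch in range(epochs):
    for idx in range(steps_per_epoch):
        x = torch.randn(16, 8)   # dummy batch
        y = torch.randn(16, 1)
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()
        # Step the cosine schedule batch-wise, then re-apply the epoch-level warmup factor.
        lr_scheduler.step(epoch + idx / steps_per_epoch)
        warmup_scheduler.dampen(epoch)
    print(f"epoch {epoch}: lr = {optimizer.param_groups[0]['lr']:.6f}")

As far as I can tell, this works because CosineAnnealingWarmRestarts recomputes the learning rate from the base learning rates on every step, so the dampening applied afterwards does not compound across batches.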


I would like to close this issue because there is no problem.