Closed: seyoulala closed this issue 9 months ago
Hey @seyoulala thanks for your kind words! :)
I see a number of ways of doing it:
Using multiple trainers: you can pass an lr_scheduler to each trainer (see here). Therefore you could do something like:
model = WideDeep(...)
first_scheduler = torch.optim.lr_scheduler.StepLR(...)
warmup_trainer = Trainer(model, objective, lr_schedulers=first_scheduler)
warmup_trainer.fit(...)  # run a few epochs with the first scheduler
# At this point the weights of the model are "warm"
trainer = Trainer(model, objective, ...)  # with a 2nd scheduler if you wanted
Passing a Sequential LR scheduler: in principle, the Trainer accepts any valid PyTorch scheduler. Therefore, you could build your own sequential learning rate scheduler and pass it, as in the sketch below. If this throws an error let me know, because it would be a bug to fix. Personally, I have never tried it (but I will now :) )
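For instance, a minimal untested sketch; it assumes a binary objective, that you build the optimizer yourself and pass it via the Trainer's optimizers argument, and PyTorch >= 1.10 for SequentialLR:

import torch
from torch.optim.lr_scheduler import LinearLR, StepLR, SequentialLR
from pytorch_widedeep import Trainer
from pytorch_widedeep.models import WideDeep

model = WideDeep(...)  # built as usual
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# 5 scheduler steps of linear warm up from 10% of the base lr, then step decay
warmup = LinearLR(optimizer, start_factor=0.1, total_iters=5)
decay = StepLR(optimizer, step_size=10, gamma=0.5)
scheduler = SequentialLR(optimizer, schedulers=[warmup, decay], milestones=[5])

trainer = Trainer(
    model,
    objective="binary",      # or whichever objective you are using
    optimizers=optimizer,
    lr_schedulers=scheduler,
)
trainer.fit(...)  # the usual X_wide / X_tab / target / n_epochs arguments

If the scheduler is stepped once per epoch (which I believe is the case for non-cyclic schedulers), the milestones are in epochs, i.e. the decay scheduler kicks in after 5 epochs of warm up.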
And the last one: the fit method allows for a series of warm-up parameters 😉. At the moment I have implemented two routines. Have a look here, at the finetune parameter, aliased as warmup, and also at this example notebook.
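A minimal sketch of that last option (the X_wide / X_tab / target names are placeholders for your preprocessed arrays; see the notebook for the full set of finetune/warmup options):

trainer = Trainer(model, objective="binary")
trainer.fit(
    X_wide=X_wide,
    X_tab=X_tab,
    target=target,
    n_epochs=20,
    batch_size=256,
    finetune=True,  # aliased as warmup: runs the warm-up routines before standard training
)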
Let me know if you need any more info/help
Thanks! Building my own learning rate scheduler with PyTorch's SequentialLR can deal with this problem.
okay, thanks @seyoulala
Hi! The package is amazing! I have a question from using it: I want to warm up the learning rate when training the model, but I find that the Trainer class has no parameter for a warmup scheduler.