automl / Auto-PyTorch

Automatic architecture search and hyperparameter optimization for PyTorch
Apache License 2.0
2.37k stars 287 forks

[bug] LR Scheduler updating at wrong times #211

Closed ravinkohli closed 3 years ago

ravinkohli commented 3 years ago

Hey, depending on their type, LR schedulers should be stepped either epoch-wise or batch-wise. Currently, all steps happen batch-wise, which is incompatible with CosineAnnealingWarmRestarts, see here.

One possible solution: make a {'step': 'epochs'} property part of each scheduler, and use it in the base trainer to decide when to make a step.
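A minimal sketch of what that could look like (the names EPOCH_WISE, step_interval, and the trainer loop are illustrative assumptions, not the actual AutoPyTorch API):

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts, StepLR

# Hypothetical: schedulers that expect one step per epoch rather than per batch.
EPOCH_WISE = (CosineAnnealingWarmRestarts,)

def step_interval(scheduler) -> str:
    """Return 'epoch' for schedulers stepped once per epoch, 'batch' otherwise."""
    return 'epoch' if isinstance(scheduler, EPOCH_WISE) else 'batch'

def train(model, optimizer, scheduler, loader, epochs):
    # The base trainer consults the property instead of unconditionally
    # stepping after every minibatch.
    interval = step_interval(scheduler)
    for _ in range(epochs):
        for batch in loader:
            optimizer.zero_grad()
            # ... forward / backward ...
            optimizer.step()
            if interval == 'batch':
                scheduler.step()
        if interval == 'epoch':
            scheduler.step()
```

In the real codebase the property would presumably live on each scheduler component rather than in a lookup tuple, but the trainer-side branching would be the same.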

I think this fix is urgently needed.

ArlindKadra commented 3 years ago

Yeah, I think the results of the new AutoPyTorch will quite probably be even better than the old one now ;) We should bring the fix over from the cocktail branch as fast as possible.

ArlindKadra commented 3 years ago

Especially considering that the range of T_0 is from 0 to 20
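To illustrate why a small T_0 makes this bug severe, here is a toy reproduction (T_0=2 and 10 batches per epoch are made-up numbers, not from the issue): since T_0 is meant in epochs, stepping batch-wise triggers a warm restart every T_0 batches, i.e. several times within a single epoch.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

param = torch.nn.Parameter(torch.zeros(1))
opt = SGD([param], lr=1.0)
# T_0=2 is intended as "restart every 2 epochs".
sched = CosineAnnealingWarmRestarts(opt, T_0=2)

lrs = []
for batch in range(10):   # one epoch of 10 batches, stepped batch-wise (the bug)
    opt.step()
    sched.step()
    lrs.append(opt.param_groups[0]['lr'])

# Count how often the LR jumps back to its maximum, i.e. a warm restart fired.
restarts = sum(1 for lr in lrs if abs(lr - 1.0) < 1e-9)
print(restarts)  # multiple restarts inside a single epoch
```

With epoch-wise stepping the same configuration would not restart at all during this one epoch.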