zalandoresearch / pytorch-ts

PyTorch-based probabilistic time series forecasting framework built on the GluonTS backend
MIT License

Inquiry about the default learning rate scheduler #115

Open WonmoKoo opened 1 year ago

WonmoKoo commented 1 year ago

When I checked trainer.py, I found that the default learning rate scheduler is OneCycleLR. I am unsure whether OneCycleLR was also used in the paper below.

"Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting (ICML, 2021)"

In Section 4.2 of the paper, you state that TimeGrad was trained with a learning rate of 0.001. However, as far as I know, the learning rate under OneCycleLR is initialized to max_lr / div_factor and then gradually increases toward max_lr. In other words, if TimeGrad was trained with the Trainer class in trainer.py, the actual learning rates would not simply equal the 0.001 mentioned in the paper.
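
For reference, here is a minimal sketch (not taken from trainer.py, and the choice of total_steps is arbitrary) of what I mean: if 0.001 is passed to OneCycleLR as max_lr with PyTorch's default div_factor of 25, the schedule does not start at 0.001.

```python
import torch
from torch import nn

# Hypothetical setup, only to illustrate OneCycleLR's behavior:
# the configured "learning rate" becomes the peak (max_lr), not the initial value.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=1e-3,       # the 0.001 from the paper, used here as the peak
    total_steps=100,   # arbitrary number of steps for this sketch
    div_factor=25.0,   # PyTorch default: initial_lr = max_lr / div_factor
)

# The schedule starts at max_lr / div_factor = 4e-5, ramps up to max_lr,
# then anneals back down over the remaining steps.
print(scheduler.get_last_lr())  # ~[4e-05]
```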

I would be very thankful if you could answer my question: which learning rate scheduler was used in the ICML paper?