thuml / Autoformer

About Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
MIT License

Arima Baseline Model #194

Closed pohlchri closed 1 year ago

pohlchri commented 1 year ago

Regarding the ARIMA baseline model, I understand that you use the pmdarima package and refit the model for every test batch, e.g. every 48 values if that is the prediction length. However, could you give a bit more detail on how you trained the model? Do you use a fixed input length for the ARIMA model to train on, and if so, which length? Otherwise, in my experience this rolling-window approach gets very time-consuming. I would highly appreciate your help!

wuhaixu2016 commented 1 year ago

Hi, we do adopt the rolling-window approach to evaluate the ARIMA baseline. It does take a lot of time.
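For readers wondering what such a rolling-window evaluation looks like in practice, here is a minimal sketch. The thread does not specify the input window length, so `input_len=96` is an assumption, and a simple least-squares AR(p) fit stands in for `pmdarima.auto_arima` (which the baseline reportedly uses) to keep the example self-contained:

```python
import numpy as np

def fit_ar(history, p=3):
    # Least-squares AR(p) fit -- a stand-in for pmdarima.auto_arima,
    # used here only to keep the sketch dependency-free.
    X = np.column_stack([history[i:len(history) - p + i] for i in range(p)])
    y = history[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(history, coef, steps):
    # Recursive multi-step forecast: feed each prediction back in.
    p = len(coef)
    buf = list(history[-p:])
    out = []
    for _ in range(steps):
        nxt = float(np.dot(coef, buf[-p:]))
        out.append(nxt)
        buf.append(nxt)
    return np.array(out)

def rolling_eval(series, pred_len=48, input_len=96):
    # Refit on the preceding input_len values, forecast pred_len steps,
    # then slide forward by pred_len -- the costly part the thread
    # discusses is this repeated refitting. Window sizes are assumptions.
    preds, trues = [], []
    t = input_len
    while t + pred_len <= len(series):
        window = series[t - input_len:t]
        coef = fit_ar(window)
        preds.append(forecast(window, coef, pred_len))
        trues.append(series[t:t + pred_len])
        t += pred_len
    preds, trues = np.concatenate(preds), np.concatenate(trues)
    return float(np.mean((preds - trues) ** 2))
```

With the actual baseline one would replace `fit_ar`/`forecast` with `pmdarima.auto_arima(window).predict(pred_len)` inside the loop; since order selection then reruns for every window, the cost grows quickly with series length, which matches the maintainer's comment.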