cure-lab / LTSF-Linear

[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
Apache License 2.0

Are Transformers Effective for Time Series Forecasting? (AAAI 2023)

This repo is the official PyTorch implementation of LTSF-Linear: "Are Transformers Effective for Time Series Forecasting?".

Updates

Features

Besides LTSF-Linear, we provide implementations of five significant forecasting Transformers to reproduce the results in the paper.

Detailed Description

We provide all experiment script files in ./scripts:

| Files | Interpretation |
| --- | --- |
| EXP-LongForecasting | Long-term time series forecasting task |
| EXP-LookBackWindow | Study the impact of different look-back window sizes |
| EXP-Embedding | Study the effects of different embedding strategies |

This code is built on the code base of Autoformer. We appreciate the following GitHub repos for their valuable code bases and datasets:

The implementation of Autoformer, Informer, and Transformer is from https://github.com/thuml/Autoformer

The implementation of FEDformer is from https://github.com/MAZiqing/FEDformer

The implementation of Pyraformer is from https://github.com/alipay/Pyraformer

LTSF-Linear

LTSF-Linear family

LTSF-Linear is a set of linear models (Linear, NLinear, DLinear).
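The core idea can be illustrated with a minimal NumPy sketch of the DLinear variant: decompose the look-back window into a moving-average trend plus a remainder, and forecast each component with its own linear map. This is an illustration only, not the repo's PyTorch implementation; the kernel size and shapes here are assumptions.

```python
import numpy as np

def moving_average(x, kernel=25):
    """Trend component via an edge-padded moving average, the
    decomposition used by DLinear-style models (kernel assumed)."""
    pad = kernel // 2
    xp = np.pad(x, (pad, kernel - 1 - pad), mode="edge")
    return np.convolve(xp, np.ones(kernel) / kernel, mode="valid")

def dlinear_forecast(history, W_trend, W_seasonal, kernel=25):
    """Sketch of DLinear for one channel: split the window into
    trend + remainder, map each with its own linear layer, sum."""
    trend = moving_average(history, kernel)
    seasonal = history - trend
    return W_trend @ trend + W_seasonal @ seasonal

rng = np.random.default_rng(0)
seq_len, pred_len = 336, 96            # look-back window and horizon
history = rng.standard_normal(seq_len)
W_t = rng.standard_normal((pred_len, seq_len)) * 0.01
W_s = rng.standard_normal((pred_len, seq_len)) * 0.01
forecast = dlinear_forecast(history, W_t, W_s)
print(forecast.shape)  # (96,)
```

In the real models the two weight matrices are learned end-to-end; per-channel or channel-shared weights are a configuration choice.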

Although LTSF-Linear is simple, it has some compelling characteristics:

Comparison with Transformers

Univariate and multivariate forecasting comparison (result figures omitted): LTSF-Linear outperforms all transformer-based methods by a large margin.

Efficiency

Comparison of method efficiency with a look-back window of 96 and 720 forecasting steps on Electricity (figure omitted). MACs are the number of multiply-accumulate operations. We use DLinear for the comparison since it has double the cost of the other LTSF-Linear models. Inference time is averaged over 5 runs.
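For intuition, the MAC count of a channel-wise linear forecasting layer is just seq_len × pred_len per channel; this back-of-envelope sketch uses the 321-series channel count of the Electricity benchmark (an assumption about the setup, not a figure from the table above):

```python
# Back-of-envelope MACs for a linear forecasting layer (sketch).
seq_len, pred_len = 96, 720      # look-back window and forecasting steps
channels = 321                   # Electricity benchmark series count (assumed)
linear_macs = seq_len * pred_len             # one (pred_len x seq_len) matrix
per_sample_macs = linear_macs * channels     # the same map applied per channel
dlinear_macs = 2 * per_sample_macs           # DLinear: trend + seasonal branches
print(linear_macs, per_sample_macs, dlinear_macs)
```

A Transformer layer, by contrast, incurs attention cost quadratic in the sequence length on top of its feed-forward blocks, which is why the linear models are orders of magnitude cheaper at these horizons.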

Getting Started

Environment Requirements

First, please make sure you have Conda installed. Then, our environment can be set up with:

conda create -n LTSF_Linear python=3.6.9
conda activate LTSF_Linear
pip install -r requirements.txt

Data Preparation

You can obtain all nine benchmarks from the Google Drive link provided in Autoformer. All datasets are pre-processed and can be used directly.

mkdir dataset

Please put them in the ./dataset directory.

Training Example

For example:

To train the LTSF-Linear on Exchange-Rate dataset, you can use the script scripts/EXP-LongForecasting/Linear/exchange_rate.sh:

sh scripts/EXP-LongForecasting/Linear/exchange_rate.sh

By default it trains DLinear; the results will be shown in logs/LongForecasting. You can specify the name of the model in the script (Linear, DLinear, or NLinear).

All scripts for running LTSF-Linear on the long-term forecasting task are in scripts/EXP-LongForecasting/Linear/; you can run them in a similar way. The default look-back window in the scripts is 336; LTSF-Linear generally achieves better results with a longer look-back window, as discussed in the paper.

Scripts for the look-back window study and for long-term forecasting with FEDformer and Pyraformer are in FEDformer/scripts and Pyraformer/scripts, respectively. To run them, first cd FEDformer or cd Pyraformer, then use sh as above. Logs will be stored in logs/.

Each experiment in scripts/EXP-LongForecasting/Linear/ takes 5-20 minutes. For the other Transformer scripts, since we put all related experiments in one script file, running one directly can take from 8 hours up to a day. You can keep the experiments you are interested in and comment out the others.

Weights Visualization

As shown in our paper, the weights of LTSF-Linear can reveal some characteristics of the data, e.g., its periodicity. As an example, we provide a weight visualization of DLinear in weight_plot.py. To run the visualization, you need to input the model path (model_name) of DLinear (the model directory in ./checkpoint by default). To obtain smooth and clear patterns, you can use the initialization we provide in the files of the linear models.
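Conceptually, the visualization renders the (pred_len × seq_len) weight matrix as an image; on periodic data, trained weights tend to show diagonal stripes at the data's period. A self-contained NumPy sketch with a synthetic periodic weight (the actual weight_plot.py loads a trained checkpoint instead; the period and shapes here are assumptions):

```python
import numpy as np

# Synthetic stand-in for a trained DLinear weight: a (pred_len, seq_len)
# matrix with a period-24 diagonal pattern, mimicking what training on
# hourly data with daily periodicity tends to produce.
seq_len, pred_len, period = 336, 96, 24
i = np.arange(pred_len)[:, None]
j = np.arange(seq_len)[None, :]
weights = np.cos(2 * np.pi * (i - j) / period)

# Normalize to [0, 1] for display (weight_plot.py would imshow this).
img = (weights - weights.min()) / (weights.max() - weights.min())
print(img.shape)  # (96, 336)
```

Plotting `img` with any image viewer (e.g. matplotlib's imshow) shows the diagonal stripes; a trained model's weights look similar when the data is strongly periodic.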


Citing

If you find this repository useful for your work, please consider citing it as follows:

@inproceedings{Zeng2022AreTE,
  title={Are Transformers Effective for Time Series Forecasting?},
  author={Ailing Zeng and Muxi Chen and Lei Zhang and Qiang Xu},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2023}
}

Please remember to cite all the datasets and compared methods if you use them in your experiments.