BSD 3-Clause "New" or "Revised" License

ETSformer: Exponential Smoothing Transformers for Time-series Forecasting



Figure 1. Overall ETSformer Architecture.

Official PyTorch code repository for the ETSformer paper. Check out our blog post!
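As background for the name: ETSformer builds transformer components around classical exponential smoothing. A minimal sketch of simple exponential smoothing is shown below; the function name and parameters are illustrative, not part of this repository.

```python
# Simple exponential smoothing (SES), the classical method that
# ETSformer generalizes inside its attention mechanisms.
# alpha is the smoothing factor in (0, 1]; larger alpha weights
# recent observations more heavily.

def exponential_smoothing(series, alpha=0.5):
    """Return the smoothed series: s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    smoothed = [series[0]]  # initialize with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

print(exponential_smoothing([1.0, 2.0, 3.0, 4.0], alpha=0.5))
# -> [1.0, 1.5, 2.25, 3.125]
```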

Requirements

  1. Install Python 3.8.
  2. Install the required dependencies: pip install -r requirements.txt

Data

Usage

  1. Install the required dependencies.
  2. Download the data as above, and place it in the dataset/ folder.
  3. Train the model. We provide experiment scripts for all benchmarks under the ./scripts folder, e.g. ./scripts/ETTm2.sh. You may need to make the script files executable by running chmod u+x scripts/*.
  4. A script for grid search is also provided, and can be run with ./grid_search.sh.
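The steps above can be collected into a single shell session (the benchmark script name is one example from the scripts/ folder; run these from the repository root):

```shell
# Install dependencies, make the experiment scripts executable,
# and launch a benchmark run (here, the ETTm2 benchmark).
pip install -r requirements.txt
chmod u+x scripts/*
./scripts/ETTm2.sh

# Optionally, run the provided grid search instead:
./grid_search.sh
```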

Acknowledgements

The implementation of ETSformer relies on resources from the following codebases and repositories; we thank the original authors for open-sourcing their work.

Citation

Please consider citing our paper if you find this code useful in your research.

@article{woo2022etsformer,
    title={ETSformer: Exponential Smoothing Transformers for Time-series Forecasting},
    author={Gerald Woo and Chenghao Liu and Doyen Sahoo and Akshat Kumar and Steven C. H. Hoi},
    journal={arXiv preprint arXiv:2202.01381},
    year={2022},
    url={https://arxiv.org/abs/2202.01381},
}