
[CIKM'23] Official code for our paper "Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting".
https://arxiv.org/abs/2308.10425

# STAEformer: Spatio-Temporal Adaptive Embedding Transformer

H. Liu\*, Z. Dong\*, R. Jiang#, J. Deng, J. Deng, Q. Chen, X. Song#, "Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting", in Proc. of the 32nd ACM International Conference on Information and Knowledge Management (CIKM), 2023. (\*Equal Contribution, #Corresponding Author)

(Figure: STAEformer model architecture)
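The core idea is that a single learnable spatio-temporal adaptive embedding, concatenated with the projected input features, is enough to make a vanilla transformer state-of-the-art on traffic forecasting. Below is a minimal sketch of that component; module and parameter names here are illustrative, not the repo's actual identifiers.

```python
import torch
import torch.nn as nn

class SpatioTemporalAdaptiveEmbedding(nn.Module):
    """Learnable embedding of shape (in_steps, num_nodes, embed_dim),
    shared across all samples and broadcast over the batch dimension."""

    def __init__(self, in_steps: int, num_nodes: int, embed_dim: int):
        super().__init__()
        self.adaptive_embedding = nn.Parameter(
            torch.empty(in_steps, num_nodes, embed_dim)
        )
        nn.init.xavier_uniform_(self.adaptive_embedding)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_steps, num_nodes, feat_dim) -- projected input features.
        # Broadcast the shared embedding over the batch and concatenate it
        # along the channel axis, next to the feature/periodicity embeddings.
        batch_size = x.shape[0]
        adp = self.adaptive_embedding.expand(
            batch_size, *self.adaptive_embedding.shape
        )
        return torch.cat([x, adp], dim=-1)
```

The combined embedding is then fed through standard temporal and spatial transformer layers; see the code under `model/` for the full implementation.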

## Citation

```bibtex
@inproceedings{liu2023spatio,
  title={Spatio-temporal adaptive embedding makes vanilla transformer sota for traffic forecasting},
  author={Liu, Hangchen and Dong, Zheng and Jiang, Renhe and Deng, Jiewen and Deng, Jinliang and Chen, Quanjun and Song, Xuan},
  booktitle={Proceedings of the 32nd ACM International Conference on Information and Knowledge Management},
  pages={4125--4129},
  year={2023}
}
```

CIKM'23 proceedings version (including METRLA, PEMSBAY, PEMS04, PEMS07, PEMS08 results): https://dl.acm.org/doi/abs/10.1145/3583780.3615160

arXiv preprint (including METRLA, PEMSBAY, PEMS03, PEMS04, PEMS07, PEMS08 results): https://arxiv.org/abs/2308.10425

## Performance on Traffic Forecasting Benchmarks


(Figures: performance comparison on the benchmark datasets above)

## Required Packages

```
pytorch>=1.11
numpy
pandas
matplotlib
pyyaml
pickle
torchinfo
```

(`pickle` is part of the Python standard library, so it does not need to be installed separately.)
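A typical installation command might look like the following; this is an assumption, not a pinned environment file from the repo, and note that `pytorch` installs as `torch` via pip:

```bash
pip install "torch>=1.11" numpy pandas matplotlib pyyaml torchinfo
```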

## Training Commands

```bash
cd model/
python train.py -d <dataset> -g <gpu_id>
```

`<dataset>`: METRLA, PEMSBAY, PEMS03, PEMS04, PEMS07, PEMS08 (the benchmarks listed above).
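For example, to train on METRLA using GPU 0 (assuming the dataset names are passed exactly as listed above):

```bash
cd model/
python train.py -d METRLA -g 0
```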