PyTorch implementation of AirFormer, AAAI-23

AirFormer

This repo is the implementation of our paper AirFormer: Predicting Nationwide Air Quality in China with Transformers. The code is based on PyTorch 1.10 and was tested on Ubuntu 16.04 with an NVIDIA RTX A6000 GPU (48 GB memory).

In this study, we present a novel Transformer architecture termed AirFormer to collectively predict nationwide air quality in China over the next 72 hours, at an unprecedentedly fine spatial granularity covering thousands of locations.
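As a rough sketch of the forecasting task's tensor shapes: the model consumes a window of past observations per monitoring station and emits a 72-step forecast. The batch size, history length, station count, and feature count below are illustrative assumptions, not values from the paper or code.

```python
import numpy as np

# Hypothetical shapes: 32 samples, 24-hour history, 1085 stations, 6 features
# (all four numbers are assumptions for illustration only).
history = np.zeros((32, 24, 1085, 6))   # past observations fed to the model
target = np.zeros((32, 72, 1085, 1))    # next 72 hours of one pollutant per station

# A trained model maps history -> a forecast with the target's shape;
# here a trivial per-station mean stands in for the real network.
forecast = np.broadcast_to(history.mean(axis=(1, 3), keepdims=True), target.shape)
print(forecast.shape)  # (32, 72, 1085, 1)
```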

Framework

Requirements

AirFormer uses the following dependencies:

Dataset

The air quality (and meteorology) dataset used in our paper is larger than 500 GB. To facilitate understanding of the source code, we provide a tiny version of the dataset, consisting of 20 training instances, 20 validation instances, and 20 test instances. Please unzip the files under the data folder before running our code.
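Unzipping can be done with the system `unzip` tool or with Python's `zipfile` module. The sketch below builds a stand-in archive in memory so it is self-contained; the member name is hypothetical, not a real file in the repo.

```python
import io
import tempfile
import zipfile

# Stand-in archive built in memory; for the real data, open the
# archives under data/ with zipfile.ZipFile("data/<archive>.zip") instead.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("tiny/train.npz", b"placeholder")  # hypothetical member name

buf.seek(0)
with zipfile.ZipFile(buf) as zf:
    zf.extractall(tempfile.mkdtemp())  # for the real data: zf.extractall("data")
    names = zf.namelist()
print(names)  # ['tiny/train.npz']
```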

Folder Structure

The major code modules are listed as follows:

Arguments

We introduce the major arguments of our main function here.

Training settings:

Model hyperparameters:
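Such settings are typically exposed via `argparse` in the main function. The flag names and defaults below are illustrative stand-ins, not the repo's actual argument list; check the main script for the real flags.

```python
import argparse

# Hypothetical argument set sketching how training settings and model
# hyperparameters are usually wired up; names and defaults are assumptions.
parser = argparse.ArgumentParser(description="AirFormer-style arguments (sketch)")
# Training settings (assumed names)
parser.add_argument("--mode", choices=["train", "test"], default="train")
parser.add_argument("--batch_size", type=int, default=32)
parser.add_argument("--seed", type=int, default=1)
# Model hyperparameters (assumed names)
parser.add_argument("--hidden_dim", type=int, default=32)
parser.add_argument("--num_layers", type=int, default=4)
parser.add_argument("--dropout", type=float, default=0.1)

args = parser.parse_args(["--mode", "train", "--hidden_dim", "64"])
print(args.mode, args.hidden_dim)  # train 64
```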

Model Training

Before running our code, please add the path of this repo to PYTHONPATH.

```shell
export PYTHONPATH=$PYTHONPATH:"the path of this repo"
```

The following examples are conducted on the tiny dataset:

Model Test

To test the models trained above, use the following command to run our code:

Citation

If you find our work useful in your research, please cite:

@article{liang2022airformer,
  title={AirFormer: Predicting Nationwide Air Quality in China with Transformers},
  author={Liang, Yuxuan and Xia, Yutong and Ke, Songyu and Wang, Yiwei and Wen, Qingsong and Zhang, Junbo and Zheng, Yu and Zimmermann, Roger},
  journal={arXiv preprint arXiv:2211.15979},
  year={2022}
}