ant-research / Pyraformer


Informer achieves lower MSEs in their own paper #22

Closed: limjcst closed this issue 1 year ago

limjcst commented 1 year ago

Taking the Electricity dataset (named ECL in the Informer paper) as an example, the Informer paper reports an MSE of 0.540 for 720-step prediction, while the Pyraformer paper reports an MSE of 4.365 for Informer on the same setting. What's more, another paper (Autoformer, published at NeurIPS 2021) reports an MSE of 0.373 for Informer, which is even lower than the value in the Informer paper itself. The same discrepancy appears for LogTrans.

I am wondering why the results reported for Informer differ so much across these papers.
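
For context on the metric being compared, here is a toy sketch (not taken from either paper) of how MSE over a multi-step horizon is typically computed. The helper name and array shapes are illustrative assumptions; the point is simply that the value depends entirely on the scale of the preprocessed data being scored.

```python
import numpy as np

def horizon_mse(y_true, y_pred):
    """Mean squared error over a multi-step forecast window.

    y_true, y_pred: arrays of shape (num_windows, horizon, num_series),
    e.g. horizon = 720 for the 720-step setting discussed above.
    """
    return float(np.mean((y_true - y_pred) ** 2))

rng = np.random.default_rng(0)
y = rng.normal(size=(8, 720, 321))            # toy targets
y_hat = y + rng.normal(scale=0.5, size=y.shape)

# The same forecasts give a 100x larger MSE when scored on a 10x larger
# scale, so numbers are only comparable under identical preprocessing.
print(horizon_mse(y, y_hat))                  # ~0.25
print(horizon_mse(10 * y, 10 * y_hat))        # ~25
```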

Zhazhan commented 1 year ago

Informer and Pyraformer preprocess the electricity dataset in different ways. We followed DeepAR in preprocessing the electricity dataset (the scripts are provided in this repo), while Informer appears to preprocess it in another way. As a result, the datasets we use are quite different. We re-ran Informer on our preprocessed data and reported the results.
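
For anyone comparing the two pipelines, here is a rough sketch of the DeepAR-style step referred to above, assuming the raw UCI file LD2011_2014.txt (';'-separated, decimal commas, 15-minute readings); the preprocessing scripts shipped in this repo are the authoritative reference, and the file paths here are assumptions.

```python
import pandas as pd

# Rough sketch only; the scripts provided in this repo are authoritative.
# Assumes the raw UCI electricity file LD2011_2014.txt with ';'-separated
# columns, decimal commas, and 15-minute readings, one column per client.
raw = pd.read_csv(
    "LD2011_2014.txt", sep=";", decimal=",", index_col=0, parse_dates=True
)

# DeepAR-style step: aggregate the 15-minute readings into hourly values,
# one series per client.
hourly = raw.resample("1H").mean()

hourly.to_csv("electricity_hourly.csv")
```

Even small differences in such preprocessing (temporal resolution, scaling, which clients or time ranges are kept) change the scale of the targets and therefore the reported MSE, which is why results from differently preprocessed versions of the dataset are not directly comparable.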

limjcst commented 1 year ago

Thanks for your quick response!