WenjieDu / SAITS

The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast, state-of-the-art (SOTA) deep neural network model for efficient time-series imputation (imputing the NaN missing values in multivariate, incomplete time series with machine learning). https://arxiv.org/abs/2202.08516
https://doi.org/10.1016/j.eswa.2023.119619
MIT License

Configs of ETTm1 #28

Closed. JeremyChou28 closed this issue 1 year ago.

JeremyChou28 commented 1 year ago

Hello, could you share the configuration settings for the ETTm1 dataset? Thanks!

WenjieDu commented 1 year ago

Hi there,

Thank you so much for your attention to SAITS! If you find SAITS helpful to your work, please star ⭐️ this repository. Your star shows your recognition and helps others notice SAITS. It matters and is definitely a kind of contribution.

I have received your message and will respond ASAP. Thank you again for your patience! 😃

Best,
Wenjie

WenjieDu commented 1 year ago

Hi Jianping, thank you for opening this issue.

Our experiments on the ETT dataset were added during peer review and were run on a research group's GPU server at Tsinghua. We didn't copy the hyper-parameter configs off that server afterwards, and those files have since been cleaned up, so I'm sorry that we don't have the exact config files. However, I remember that the SAITS hyper-parameters we used on the ETT dataset are the same as those for PhysioNet-2012, so you can start from there (a rough sketch is below). You can also use the hyper-parameter searching mode in our scripts to obtain the configs for each model. I'm pretty sure you can reproduce our results, given some time.
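Concretely, starting from the PhysioNet-2012 config would look roughly like this (a sketch only; double-check the exact file names under `configs/` in your checkout, and note that the ETTm1 config name below is just an example):

```bash
# start from the PhysioNet-2012 hyper-parameter config and adapt it for ETTm1
cp configs/PhysioNet2012_SAITS_best.ini configs/ETTm1_SAITS.ini
# edit the dataset-related entries in configs/ETTm1_SAITS.ini
# (data file path, sequence length, feature number: ETTm1 has 7 variables),
# then launch training with the repo's entry script
python run_models.py --config_path configs/ETTm1_SAITS.ini
```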

Let me know if there's anything I can help with.

JeremyChou28 commented 1 year ago

OK, I will give it a try. Thank you very much!

WenjieDu commented 1 year ago

With pleasure. If your research involves (incomplete) time series, our new work PyPOTS (a Python toolbox for data mining on Partially-Observed Time Series, https://pypots.com) may be useful to you; it may deserve your attention 😊
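For instance, imputing an incomplete multivariate time series with SAITS takes only a few lines in PyPOTS. The snippet below is a minimal sketch following the PyPOTS README at the time of writing; the hyper-parameter values are illustrative, not our tuned ETT settings, and argument names may differ in newer PyPOTS releases:

```python
import numpy as np
from pypots.imputation import SAITS

# toy incomplete dataset: 100 samples, 48 time steps, 7 features
# (ETTm1 also has 7 variables); some values are masked out as NaN
X = np.random.randn(100, 48, 7)
X[X < -1.5] = np.nan

# hyper-parameters here are illustrative, not the tuned ETT/PhysioNet-2012 ones
saits = SAITS(
    n_steps=48, n_features=7,
    n_layers=2, d_model=256, d_inner=128,
    n_heads=4, d_k=64, d_v=64,
    dropout=0.1, epochs=10,
)
dataset = {"X": X}
saits.fit(dataset)                   # train on the partially observed data
imputation = saits.impute(dataset)   # fill in the NaN values
```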

Good luck with your research.