WenjieDu / SAITS

The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516
https://doi.org/10.1016/j.eswa.2023.119619
MIT License

Training stage of an attention-based model! #40

Open kalluarjun69 opened 1 week ago

kalluarjun69 commented 1 week ago

Greetings Wenjie, I was very much impressed by your work "SAITS". I am trying to create an attention-based model of my own as part of my Bachelor's project, and I have a few questions. I would like to know how many trials you ran for SAITS in the training stage. I noticed you already answered a similar question from the user "Rajesh90123", where you said you "just let the experiment run till I thought it was good to stop". Just for reference, could you please tell me roughly how many trials you ran during hyperparameter tuning before you decided it was good enough to stop? Was it on the order of 100s, 1,000s, 10,000s, or more? Please let me know at your earliest convenience. Thank you!
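
To avoid any ambiguity about what I mean by a "trial", here is a minimal sketch of a bounded random search: each trial samples one hyperparameter configuration, trains a model, and keeps the best validation score. The search-space values and the `train_and_evaluate` helper below are purely hypothetical placeholders for illustration, not taken from the SAITS paper or this repository.

```python
import random

# Hypothetical search space, only to illustrate what one "trial" means here.
# The names mirror typical SAITS-style hyperparameters but the values are assumptions.
SEARCH_SPACE = {
    "n_layers": [2, 3, 4],
    "d_model": [128, 256, 512],
    "n_heads": [4, 8],
    "dropout": [0.0, 0.1, 0.2],
    "lr": [1e-4, 5e-4, 1e-3],
}

def train_and_evaluate(config):
    """Hypothetical stand-in for one training run: build the model from
    `config`, train it, and return the validation MAE."""
    # ... real training/validation code would go here ...
    return random.random()  # placeholder metric so the sketch runs end to end

N_TRIALS = 50  # the budget I am asking about: is it 10s, 100s, 1000s, or more?
best_mae, best_config = float("inf"), None
for trial in range(N_TRIALS):
    config = {name: random.choice(choices) for name, choices in SEARCH_SPACE.items()}
    mae = train_and_evaluate(config)
    if mae < best_mae:
        best_mae, best_config = mae, config

print("best config:", best_config, "best MAE:", best_mae)
```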

WenjieDu commented 1 week ago

Hi there,

Thank you so much for your attention to SAITS! If you find SAITS helpful to your work, please star⭐️ this repository. Your star shows your recognition and helps others notice SAITS. It matters and is definitely a kind of contribution.

I have received your message and will respond ASAP. Thank you again for your patience! 😃

Best,
Wenjie