WenjieDu / SAITS

The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516
https://doi.org/10.1016/j.eswa.2023.119619
MIT License
319 stars 50 forks

Question about output of the first DMSA #35

Closed miten073 closed 4 months ago

miten073 commented 5 months ago

Hello, I want to ask about the modeling in saits.py in your code. I only used the first DMSA module, and I fed in X and the missing mask the way you do, but after passing through the encoder layer the data becomes all NaN. What could cause this? Looking forward to your reply.

WenjieDu commented 5 months ago

Hi there,

Thank you so much for your attention to SAITS! If you find SAITS helpful to your work, please star⭐️ this repository. Your star is your recognition, which lets others notice SAITS. It matters and is definitely a kind of contribution.

I have received your message and will respond ASAP. Thank you again for your patience! 😃

Best,
Wenjie

WenjieDu commented 5 months ago

This is probably caused by a bug in your code.

Thanks for your attention to our work. You can follow me on GitHub to receive the latest news about PyPOTS and our open science research.

miten073 commented 5 months ago

Thank you for your answer, but my problem persists; I suspect my input is wrong.

1. Should the 'x' fed into the first DMSA block still contain the NaN values?
2. Does the 'miss_mask' mark observed values as 1 and missing values as 0?
3. If the input is correct, then after the embedding layers every row containing a NaN becomes all NaN. What should I do about this?

Looking forward to your answer.
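For readers hitting the same NaN propagation: a common cause is feeding raw NaNs into a linear/embedding layer, since any matmul involving a NaN turns the whole output row into NaN. A minimal sketch of the usual preprocessing (replace NaNs with zeros, build a 1/0 observed-mask, then concatenate along the feature dimension) is below; the tensor shapes and the concatenation step follow the SAITS paper's input convention, but variable names here are illustrative, not the repository's API:

```python
import torch

# Toy multivariate time series with NaN missing values,
# shape (batch, n_steps, n_features).
X = torch.tensor([[[1.0, float("nan")],
                   [float("nan"), 2.0],
                   [3.0, 4.0]]])

# missing_mask: 1 where the value is observed, 0 where it is missing.
missing_mask = (~torch.isnan(X)).float()

# Replace NaNs with zeros BEFORE any linear/embedding layer; otherwise
# NaNs propagate through the matmul and whole rows become NaN.
X_filled = torch.nan_to_num(X, nan=0.0)

# SAITS-style input: concatenate data and mask on the feature dimension,
# giving shape (batch, n_steps, 2 * n_features).
model_input = torch.cat([X_filled, missing_mask], dim=2)

print(torch.isnan(model_input).any())  # tensor(False)
```

If `model_input` still produces NaNs downstream, the normalization step (e.g. dividing by a per-feature std that is zero) is another place worth checking.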
github-actions[bot] commented 5 months ago

This issue has had no activity for 14 days. It will be closed in 1 week unless there is new activity. Is this issue already resolved?