A framework for using LSTMs to detect anomalies in multivariate time series data. Includes spacecraft anomaly data and experiments from the Mars Science Laboratory and SMAP missions.
As described in the README, channels such as S-1 and D-1 are clearly pre-scaled according to the min/max of the test set.
But channels such as E-3 and E-8 have different value ranges in the training set and in the normal segments of the test set.
Is this difference in value range caused by a change in telemetry behavior in the test set, or by the fact that the training set was pre-scaled according to the min/max of the training set (i.e., each split scaled independently)?
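To illustrate the second possibility: if the training and test splits were each min-max scaled with their own statistics, anomalies in the test set would stretch its min/max and compress the normal segments into a narrower range than the training data, even when the underlying telemetry is unchanged. A minimal sketch of this effect (synthetic data and the `minmax_scale` helper are my own assumptions, not the repo's preprocessing):

```python
import numpy as np

def minmax_scale(x, lo, hi):
    # Map x into [-1, 1] using the supplied min/max (the channels appear to lie in this range).
    return 2 * (x - lo) / (hi - lo) - 1

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 1000)
# Test split: same normal behavior as training, plus an anomalous high-valued tail.
test = np.concatenate([rng.normal(0.0, 1.0, 900), rng.normal(5.0, 1.0, 100)])

# Scale each split independently, with its own min/max.
train_s = minmax_scale(train, train.min(), train.max())
test_s = minmax_scale(test, test.min(), test.max())

# The normal segment of the test set now occupies a visibly narrower range than
# the training data, purely as an artifact of per-split scaling.
print(train_s.min(), train_s.max())            # spans the full [-1, 1] by construction
print(test_s[:900].min(), test_s[:900].max())  # compressed: anomalies set the test max
```

If the repo's channels were scaled this way, a range mismatch like the one in E-3/E-8 would not by itself indicate a behavioral change.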