maxjcohen / transformer

Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
https://timeseriestransformer.readthedocs.io/en/latest/
GNU General Public License v3.0

How to apply this transformer model to a long univariate time series #42

Closed: Dezhao-Huang closed this issue 3 years ago

Dezhao-Huang commented 3 years ago

Dear all, first of all, thank you Max for sharing this amazing transformer implementation with us! I was able to successfully run training.ipynb by following the solutions in #34, using the x_train_LsAZgHU.csv and y_train_EFo1WyE.csv datasets. I have read the data descriptions in the Ozechallenge_benchmark: the input dataset (x_train_LsAZgHU.csv) has 7500 rows and 12116 columns, the output dataset (y_train_EFo1WyE.csv) has 7500 rows and 5377 columns, and each time series is 28 * 24 = 672 time steps long.

My question is: if I have a long single-variable time series, for example a NumPy array of shape [1, 1000000], how do I feed this long univariate time series to this transformer model? I would like to accurately predict the trend of this series.

How should I prepare the initial .npz dataset? How should I set d_input, d_output, and the batch size?
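For concreteness, here is a minimal sketch of the sliding-window preparation I have in mind (not code from this repo; the window length, file names, .npz keys, and next-step target are my own assumptions):

```python
# Minimal sketch: slice a long univariate series of shape (1, 1_000_000)
# into fixed-length windows with the (n_samples, K, n_features) layout
# a sequence-to-sequence transformer expects. All names below are
# illustrative placeholders, not values from this repository.
import numpy as np

K = 672                                               # window length per sample (assumed)
series = np.load("my_long_series.npy").reshape(-1)    # hypothetical file, 1_000_000 values

windows, targets = [], []
for start in range(0, len(series) - K, K):            # non-overlapping windows; shrink the step to overlap
    windows.append(series[start:start + K])           # model input: K observed values
    targets.append(series[start + 1:start + K + 1])   # target: the same window shifted one step ahead

X = np.stack(windows)[..., np.newaxis]   # (n_samples, K, 1)  -> d_input  = 1 for a univariate series
Y = np.stack(targets)[..., np.newaxis]   # (n_samples, K, 1)  -> d_output = 1 if predicting one variable

np.savez("univariate_dataset.npz", x=X, y=Y)          # illustrative .npz layout and key names
print(X.shape, Y.shape)
```

With a layout like this, d_input and d_output would both be 1, and the batch size would just be a training hyperparameter over the n_samples windows. I am not sure whether this is the intended way to use the model, hence the question.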

maxjcohen commented 3 years ago

Hi, answer in two parts: