yuqinie98 / PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
Apache License 2.0

On the Meaning of Parameters during Dataset Partitioning #71

Open ZhuLmumu opened 1 year ago

ZhuLmumu commented 1 year ago

Hello, I noticed that you use the parameters `seq_len`, `label_len`, and `pred_len` when partitioning the dataset in `class Dataset_Custom(Dataset)`, and I don't quite understand their exact meanings, especially `label_len`. Could you please explain them? An example would be even better. Looking forward to your reply. Thank you very much.

ZhuLmumu commented 1 year ago

Also, about the parameter `seq_x_mark`: its shape is `(seq_len, 4)`, and I think the 4 columns represent month, day, weekday, and hour. But why is every row equal? For example, when `seq_len=5`, `seq_x_mark` is:

```
[[ 0.19565217 -0.5        -0.43333333  0.00136986]
 [ 0.19565217 -0.5        -0.43333333  0.00136986]
 [ 0.19565217 -0.5        -0.43333333  0.00136986]
 [ 0.19565217 -0.5        -0.43333333  0.00136986]
 [ 0.19565217 -0.5        -0.43333333  0.00136986]]
```
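A note on what those four columns likely are. Assuming the repo reuses the Autoformer-style `utils/timefeatures.py` with `timeenc=1` and `freq='h'` (an assumption, not confirmed in this thread), the columns are hour-of-day, day-of-week, day-of-month, and day-of-year, each scaled to roughly `[-0.5, 0.5]` — not month/day/weekday/hour. Identical rows would then appear whenever consecutive timestamps fall within the same hour (e.g., minute-level data encoded with hourly features). A minimal sketch:

```python
import numpy as np
import pandas as pd

def hourly_time_features(dates):
    """Sketch of Autoformer-style time features for freq='h' (assumed
    layout: hour-of-day, day-of-week, day-of-month, day-of-year,
    each scaled to [-0.5, 0.5])."""
    idx = pd.DatetimeIndex(dates)
    return np.stack([
        idx.hour / 23.0 - 0.5,
        idx.dayofweek / 6.0 - 0.5,
        (idx.day - 1) / 30.0 - 0.5,
        (idx.dayofyear - 1) / 365.0 - 0.5,
    ], axis=1)

# Minute-level timestamps inside one hour -> every row identical,
# matching the repeated rows shown above.
same_hour = hourly_time_features(
    pd.date_range("2021-07-05 16:00", periods=5, freq="min"))

# Hourly timestamps -> the hour-of-day column changes row by row.
hourly = hourly_time_features(
    pd.date_range("2021-07-05 16:00", periods=5, freq="h"))
```

The example dates here are hypothetical, chosen only to illustrate the constant-within-an-hour behavior.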

yuqinie98 commented 1 year ago

Hi! We use the same dataloader as previous models (e.g., Autoformer, FEDformer), but since PatchTST is encoder-only, some parameters tied to encoder-decoder models, such as `label_len`, are actually unused here. Sorry for the confusion.
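For readers landing here: the three parameters describe how each training window is sliced. A sketch of the Autoformer-style `Dataset_Custom.__getitem__` slicing (a simplified illustration, not the repo's exact code):

```python
import numpy as np

def make_window(data, index, seq_len, label_len, pred_len):
    """One (seq_x, seq_y) pair, sliced the way the Autoformer-style
    Dataset_Custom.__getitem__ does (sketch)."""
    s_begin = index
    s_end = s_begin + seq_len        # lookback window for the encoder
    r_begin = s_end - label_len      # seq_y overlaps the last label_len
    r_end = s_end + pred_len         #   lookback steps, then adds the horizon
    seq_x = data[s_begin:s_end]      # shape: (seq_len, n_vars)
    seq_y = data[r_begin:r_end]      # shape: (label_len + pred_len, n_vars)
    return seq_x, seq_y

data = np.arange(100).reshape(100, 1)   # toy univariate series 0..99
seq_x, seq_y = make_window(data, index=0, seq_len=10, label_len=5, pred_len=3)
```

The first `label_len` rows of `seq_y` duplicate the tail of `seq_x`: an encoder-decoder model uses them as the decoder "start token," while an encoder-only model like PatchTST only scores its output against the final `pred_len` rows.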

ZhuLmumu commented 1 year ago

Thank you for your answer! So are `seq_len` and `pred_len` the L and T from the paper? And what about `seq_x_mark` and `seq_y_mark` — what processing have they gone through?

yuqinie98 commented 1 year ago

Yes. You can refer to this file for how we use the dataloader: https://github.com/yuqinie98/PatchTST/blob/d64fdb443e4d3103bcdfff7c955d7e26704f476f/PatchTST_supervised/exp/exp_main.py. `seq_x_mark` is not used in supervised PatchTST.
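In shapes, the pattern in `exp_main.py` is: the model is called with `batch_x` alone, and the loss targets only the last `pred_len` steps of `batch_y`, so both `label_len` and the mark tensors drop out. A shape-only sketch (NumPy stands in for torch tensors; the dimension sizes are illustrative):

```python
import numpy as np

# batch_x carries the L lookback steps; batch_y carries
# label_len + pred_len steps as produced by the dataloader.
batch, seq_len, label_len, pred_len, n_vars = 8, 336, 48, 96, 7
batch_x = np.random.randn(batch, seq_len, n_vars)
batch_y = np.random.randn(batch, label_len + pred_len, n_vars)

# outputs = model(batch_x)  # encoder-only: no decoder input, no marks
target = batch_y[:, -pred_len:, :]   # only the T forecast steps are scored
```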