huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

[Time-Series] time-series patching #22075

Closed elisim closed 1 year ago

elisim commented 1 year ago

Model description

"Time-series patching" refers to the process of segmenting the series into subseries-level patches, which serve as input tokens to the transformer. It's very similar to what is done in ViT, but for time series. This idea was first proposed in a recent ICLR paper:

A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. Code: https://github.com/yuqinie98/PatchTST
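As a rough illustration of the idea (not the paper's implementation), patching slices a length-`L` series into overlapping windows of length `patch_len` taken every `stride` steps, and each window becomes one token. A minimal sketch, with a hypothetical `patchify` helper:

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int, stride: int) -> np.ndarray:
    # series: (seq_len, n_channels) -> patches: (n_patches, patch_len, n_channels)
    # Each patch is later flattened/projected to form one transformer input token.
    seq_len = series.shape[0]
    n_patches = (seq_len - patch_len) // stride + 1
    return np.stack(
        [series[i * stride : i * stride + patch_len] for i in range(n_patches)]
    )

# Toy univariate series of length 20
x = np.arange(20, dtype=float).reshape(20, 1)
patches = patchify(x, patch_len=8, stride=4)
print(patches.shape)  # (4, 8, 1)
```

With `patch_len=8` and `stride=4`, a length-20 series yields 4 overlapping patches, so the transformer attends over 4 tokens instead of 20 raw timesteps.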

@kashif @NielsRogge

Open source status

Provide useful links for the implementation

@yuqinie98

Edit: I think "new model" is not the best label for this issue; maybe there is a better one?

elisim commented 1 year ago

PatchTST is in GluonTS, thanks to @kashif. Closing here :) https://github.com/awslabs/gluonts/pull/2748

yuqinie98 commented 1 year ago

Oh, I apologize, I only just noticed this issue. Somehow I hadn't seen it before... Thank you so much for your interest in PatchTST! Is there anything I can do to help? Also, glad to see the GluonTS application!

kashif commented 1 year ago

@yuqinie98 not a problem... I will get PatchTST added to transformers next

kashif commented 1 year ago

@yuqinie98 also note that the model in GluonTS is not exactly the PatchTST implementation from your paper (or the implementations in tsai and neuralforecast):