yuqinie98 / PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023). https://arxiv.org/abs/2211.14730
Apache License 2.0

Pretrained Models in Huggingface Repository #109

Open shisi-cc opened 5 months ago

shisi-cc commented 5 months ago

I came across the PatchTST pretrained models in the Hugging Face repository, and I would like to know whether these models are officially released by your team. If so, could you please provide guidance on how to use them and possibly share some example code? I have sketched my current understanding below; please correct me if it is wrong.
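
For concreteness, here is a minimal sketch of what I imagine the intended usage to be, assuming the checkpoints are compatible with the `PatchTST` classes in the Hugging Face `transformers` library; the checkpoint id is a placeholder, not a confirmed release:

```python
# Minimal sketch, assuming Hugging Face transformers' PatchTST classes.
import torch
from transformers import PatchTSTConfig, PatchTSTForPrediction

# Placeholder model id; replace with the actual checkpoint name.
checkpoint = "<hf-username>/<patchtst-checkpoint>"

model = PatchTSTForPrediction.from_pretrained(checkpoint)
config: PatchTSTConfig = model.config

# Dummy multivariate input of shape (batch, context_length, num_input_channels).
past_values = torch.randn(1, config.context_length, config.num_input_channels)

with torch.no_grad():
    outputs = model(past_values=past_values)

# Forecast of shape (batch, prediction_length, num_input_channels).
print(outputs.prediction_outputs.shape)
```

Is this roughly how the pretrained models are meant to be loaded, or do they require the training code in this repository instead?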