amotl opened 8 months ago
Adaptive Time-Frequency Ensembled Network for Long-term Time Series Forecasting.
A simple repository for training time series large observation models. This repository began its life as Andrej Karpathy's nanoGPT, and has been altered so that it is usable for time series data.
TinyTimeMixers (TTMs) are compact pre-trained models for multivariate time series forecasting, open-sourced by IBM Research. With less than 1 million parameters, TTM introduces the notion of the first-ever "tiny" pre-trained models for time series forecasting.
The current open-source version supports point forecasting use cases ranging from minutely to hourly resolutions (e.g. 10 min, 15 min, 1 hour). Note that zero-shot, fine-tuning, and inference tasks with TTM can easily be run on a single-GPU machine or even on a laptop.
TTM-1 currently supports two modes: zero-shot forecasting, where the pre-trained model is applied directly to the target data, and fine-tuned forecasting, where the pre-trained model is first fine-tuned on a subset of the target data.
Since TTM models are extremely small and fast, it is easy to fine-tune the model on your target data in a few minutes to get more accurate forecasts. For more details on the TTM architecture and benchmarks, refer to our paper.
HF: https://huggingface.co/ibm/TTM Paper: https://arxiv.org/pdf/2401.03955.pdf Repository: https://github.com/IBM/tsfm/tree/main/tsfm_public/models/tinytimemixer
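As an illustration, a minimal zero-shot sketch with TTM might look as follows. It assumes the tsfm_public package is installed; the class name TinyTimeMixerForPrediction comes from the repository above, but the exact argument and output field names are assumptions that may differ between versions.

```python
# Hypothetical zero-shot forecasting sketch with TTM; argument and output
# field names are assumptions and may differ between tsfm versions.
import torch
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

# Load the pre-trained checkpoint from the Hugging Face Hub.
model = TinyTimeMixerForPrediction.from_pretrained("ibm/TTM")
model.eval()

# Dummy multivariate context: batch of 1, 512 past time steps, 3 channels.
past_values = torch.randn(1, 512, 3)

with torch.no_grad():
    outputs = model(past_values=past_values)

# The forecast horizon is fixed by the checkpoint.
print(outputs.prediction_outputs.shape)  # (1, horizon, 3), assuming this field name
```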
NeuralForecast offers a large collection of neural forecasting models focused on their usability and robustness. The models range from classic networks like MLPs and RNNs to novel proven contributions like NBEATS, NHITS, TFT, and other architectures.
Web: https://nixtlaverse.nixtla.io/neuralforecast/ Repository: https://github.com/Nixtla/neuralforecast
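A short usage sketch, loosely following the NeuralForecast quickstart; the toy dataframe and hyperparameters are illustrative only.

```python
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.models import NBEATS, NHITS

# Long-format dataframe with unique_id / ds / y columns, as NeuralForecast expects.
df = pd.DataFrame({
    "unique_id": ["series_1"] * 48,
    "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
    "y": [float(i) for i in range(48)],
})

horizon = 12
nf = NeuralForecast(
    models=[
        NBEATS(input_size=2 * horizon, h=horizon, max_steps=50),
        NHITS(input_size=2 * horizon, h=horizon, max_steps=50),
    ],
    freq="MS",
)
nf.fit(df=df)
forecasts = nf.predict()  # one row per future timestamp, one column per model
print(forecasts.head())
```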
A professionally curated list of papers (with available code), tutorials, and surveys on recent AI for Time Series Analysis (AI4TS), including time series, spatio-temporal data, event data, sequence data, temporal point processes, etc., at the top AI conferences and journals. The list is updated as soon as accepted papers are announced at the corresponding venues. We hope it is helpful for researchers and engineers interested in AI for Time Series Analysis.
Repository: https://github.com/qingsongedu/awesome-AI-for-time-series-papers
We propose an efficient design of Transformer-based models for multivariate time series forecasting and self-supervised representation learning. It is based on two key components: (i) segmentation of the time series into subseries-level patches that serve as input tokens to the Transformer; (ii) channel-independence, where each channel contains a single univariate time series and all channels share the same embedding and Transformer weights. The patching design has a three-fold benefit: local semantic information is retained in the embedding; computation and memory usage of the attention maps are reduced quadratically for the same look-back window; and the model can attend to a longer history.
Our channel-independent patch time series Transformer (PatchTST) significantly improves long-term forecasting accuracy compared with SOTA Transformer-based models. We also apply our model to self-supervised pre-training tasks and attain excellent fine-tuning performance, which outperforms supervised training on large datasets. Transferring the masked pre-trained representation from one dataset to others also produces SOTA forecasting accuracy.
Paper: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers Repository: https://github.com/yuqinie98/PatchTST
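To make the patching idea concrete, here is a minimal sketch (not the authors' code) of how a look-back window can be split into overlapping subseries-level patches that become the Transformer's input tokens; the sizes are illustrative.

```python
import torch

context_length = 336  # look-back window (illustrative)
patch_length = 16     # length of each subseries-level patch
stride = 8            # step between consecutive patches (overlapping)

series = torch.randn(1, context_length)  # (batch, time), one univariate channel
patches = series.unfold(dimension=-1, size=patch_length, step=stride)
print(patches.shape)  # (1, num_patches, patch_length); num_patches = 41 here

# Each patch is linearly embedded into a token; the same embedding and
# Transformer weights are shared across channels (channel-independence).
```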
State-of-the-art deep learning library for time series and sequences.
tsai is an open-source deep learning package built on top of PyTorch & fastai, focused on state-of-the-art techniques for time series tasks like classification, regression, forecasting, imputation… tsai is currently under active development by timeseriesAI.
Repository: https://github.com/timeseriesAI/tsai Documentation: https://timeseriesai.github.io/tsai/
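For orientation, here is a classification sketch roughly following the tsai quickstart; the helper and class names (get_UCR_data, TSClassifier) are taken from my reading of the tsai documentation and should be treated as assumptions, since the API evolves.

```python
# Sketch of a tsai classification workflow; names follow the tsai quickstart
# as I recall it and may not match the current API exactly.
from tsai.all import *

# Load a small UCR benchmark dataset together with train/valid split indices.
X, y, splits = get_UCR_data("ECG200", split_data=False)

# Scikit-learn-style learner wrapping an InceptionTime-like architecture.
clf = TSClassifier(X, y, splits=splits, arch="InceptionTimePlus", metrics=accuracy)
clf.fit_one_cycle(10, 1e-3)
```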
Time series forecasting with scikit-learn models.
Skforecast is a Python library that makes it easy to use scikit-learn regressors as single-step and multi-step forecasters. It works with any regressor compatible with the scikit-learn API (LightGBM, XGBoost, CatBoost, ...).
Homepage: https://skforecast.org/ Repository: https://github.com/JoaquinAmatRodrigo/skforecast
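A short recursive multi-step forecasting sketch; it uses the classic ForecasterAutoreg API from older skforecast releases, which may have been renamed in newer versions, and the toy series is illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from skforecast.ForecasterAutoreg import ForecasterAutoreg

# Any regressor compatible with the scikit-learn API works here
# (LightGBM, XGBoost, CatBoost, ...).
y = pd.Series(
    np.sin(np.arange(200) / 10.0),
    index=pd.date_range("2020-01-01", periods=200, freq="D"),
)

forecaster = ForecasterAutoreg(regressor=RandomForestRegressor(random_state=123), lags=15)
forecaster.fit(y=y)

# Predict the next 10 steps recursively, feeding predictions back as lags.
predictions = forecaster.predict(steps=10)
print(predictions)
```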
About
At ^1, we shared a few notes about time series anomaly detection and forecasting/prediction. Beyond traditional statistics-based time series forecasting methods like Holt-Winters or ARIMA, and libraries like Prophet and friends, other kinds of prediction methods are emerging, based on machine learning and deep learning models such as TimeGPT or Chronos, which allow for zero-shot inference.
TimeGPT-1
Chronos: Learning the Language of Time Series
Chronos is a family of pretrained time series forecasting models based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
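A minimal zero-shot sketch following the chronos-forecasting README; the checkpoint name, context, and prediction length are illustrative.

```python
import torch
from chronos import ChronosPipeline

# Load a pre-trained Chronos checkpoint from the Hugging Face Hub.
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Historical context as a 1-D tensor; Chronos scales and quantizes it into tokens.
context = torch.tensor([float(i % 12) for i in range(120)])

# Probabilistic forecast: multiple future trajectories are sampled from the model.
forecast = pipeline.predict(context, prediction_length=24, num_samples=20)
print(forecast.shape)  # (1, num_samples, prediction_length)
```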