thuml / Large-Time-Series-Model

Official code, datasets, and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
https://arxiv.org/abs/2402.02368
MIT License

Questions on Unified Time Series Dataset and Model Training Resources #1

Closed wyl010607 closed 3 months ago

wyl010607 commented 4 months ago

Hello,

I'm truly impressed by your recent work applying decoder-only Transformer architectures to time series tasks. It's great to see such innovative approaches being explored.

I have a couple of questions I’m hoping you could help with:

  1. Dataset Availability: You've mentioned the "Unified Time Series Dataset" in your research. Are there plans to make this dataset publicly available? If so, how and when can it be accessed?
  2. Resource Utilization for Model Training: You've discussed models of different sizes, such as 3M, 29M, and 51M parameters. Could you share details about the computational resources required for training them? Specifically, what GPU resources were used, and how long did training take for each model size? (See the rough sketch in the P.S. below.)

Thanks for sharing your findings, and I look forward to your response!
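P.S. To make the second question concrete, here is the back-of-the-envelope parameter count I used to sanity-check the cited sizes. This is a minimal sketch: the layer counts and widths below are my own guesses chosen to land near 3M/29M/51M, not your actual configurations, and it ignores biases, layer norms, and embeddings.

```python
# Back-of-the-envelope parameter count for a decoder-only Transformer.
# NOTE: the configurations below are guesses chosen to land near the
# cited model sizes; they are not the actual settings from the paper.

def decoder_params(n_layers: int, d_model: int, d_ff: int) -> int:
    """Approximate trainable parameters of a decoder-only stack.

    Per layer: 4 * d_model^2 for the Q/K/V/output projections of
    self-attention, plus 2 * d_model * d_ff for the feed-forward
    network. Biases, layer norms, and embeddings are ignored.
    """
    per_layer = 4 * d_model**2 + 2 * d_model * d_ff
    return n_layers * per_layer

# Hypothetical configs (n_layers, d_model), with d_ff = 4 * d_model:
for label, (n_layers, d_model) in {
    "~3M":  (4, 256),   # -> 3,145,728
    "~29M": (9, 512),   # -> 28,311,552
    "~51M": (7, 768),   # -> 49,545,216
}.items():
    total = decoder_params(n_layers, d_model, 4 * d_model)
    print(f"{label}: {total:,} parameters")
```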

Best,

WenWeiTHU commented 4 months ago

Thank you for your attention.