BennyTMT / LLMsForTimeSeries


Great works #2

Closed WenWeiTHU closed 5 hours ago

WenWeiTHU commented 14 hours ago

Hi, thank you very much for your excellent work in revealing the problems of the prevalent LLM4TS method.

I'm one of the authors of AutoTimes: Autoregressive Time Series Forecasters via Large Language Models. In this work, we find that the issue may come from the inconsistency between the prevailing non-autoregressive forecasting approach and autoregressive LLMs, so we propose a simple autoregressive LLM4TS method, which effectively reduces the adaptation cost while improving performance.

We conducted ablation studies following your codebase and found that large language models are useful as autoregressive forecasters. We wonder whether the rest of the ablation studies will be open-sourced so that the field can explore LLM4TS methods more deeply. Thanks!

BennyTMT commented 6 hours ago

Thank you for your work as well! The method in your paper is different from the LLM4TS approaches we are evaluating; perhaps it could offer a good solution to the current issues, so good luck with your work!

The other ablations are relatively simple and are as follows:

  1. Few-shot: Reduce your training data to 10%.
  2. Sequential dependencies: Shuffle your time series order during inference.
  3. Random initialization: Randomly initialize your LLM parameters before training.

These implementations are not difficult and can be done within a few lines of code, though they may vary depending on the structure of each project. You may want to implement them in your own codebase, in your own style.
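For reference, a minimal PyTorch sketch of the three ablations is below. The names `train_dataset`, `batch_x`, and `llm` are placeholders (not identifiers from this repository), and the exact hooks will differ per project.

```python
import torch
from torch import nn
from torch.utils.data import Dataset, Subset


def make_few_shot(train_dataset: Dataset, ratio: float = 0.1) -> Subset:
    """Ablation 1 (few-shot): keep only the first `ratio` fraction of the training data."""
    return Subset(train_dataset, range(int(ratio * len(train_dataset))))


def shuffle_time_order(batch_x: torch.Tensor) -> torch.Tensor:
    """Ablation 2 (sequential dependencies): permute the time dimension of an
    input window (shaped [batch, seq_len, channels]) at inference time."""
    perm = torch.randperm(batch_x.size(1))
    return batch_x[:, perm, :]


def reinit_llm(llm: nn.Module) -> nn.Module:
    """Ablation 3 (random initialization): re-initialize the LLM backbone's
    parameters instead of loading the pretrained checkpoint."""
    for module in llm.modules():
        if hasattr(module, "reset_parameters"):
            module.reset_parameters()
    return llm
```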

WenWeiTHU commented 5 hours ago

Thanks for your detailed instructions! Your work is instrumental in advancing the understanding and application of large language models in the time series community. Looking forward to future discussions!