
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
Apache License 2.0

Custom small length dataset #33

Closed scarbain closed 5 months ago

scarbain commented 6 months ago

Hi !

Thank you for this foundation model! I've been trying to fine-tune it on a dataset of 10K time series, but each series only has 100 values (1-minute frequency, values ranging from -100 to 100). The series do not follow on from one another, so I can't concatenate them.

Using your fine-tuning notebook, the results are currently not good at all. What I'm interested in is predicting the last 20 values of each time series I feed the model, so the window should always be an 80-value context plus a 20-value prediction.

If I'm not mistaken, the fine-tuning script currently samples a random context window from each time series in the dataset, right? Do you think the model can be fine-tuned on a dataset like mine, or should I build a model from scratch for this task?
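The fixed split described here (always forecasting the last 20 of 100 values) can be sketched as follows; the random data, sizes, and variable names are illustrative assumptions, not code from the repository:

```python
import numpy as np

# Hypothetical setup: 10K independent series of 100 points each,
# 1-minute frequency, values in [-100, 100] (randomly generated here).
rng = np.random.default_rng(0)
n_series, series_len = 10_000, 100
context_len, pred_len = 80, 20  # predict the last 20 values from the first 80

data = rng.uniform(-100, 100, size=(n_series, series_len))

# Hold out the final pred_len points of every series as the forecast target;
# at inference the model only ever sees the first context_len points.
contexts = data[:, :context_len]
targets = data[:, context_len:]

assert contexts.shape == (n_series, context_len)
assert targets.shape == (n_series, pred_len)
```

In contrast, a fine-tuning loop that samples a random window per series would draw the 80+20 slice from varying start positions, which is only possible here if the series were longer than 100 points.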

Thank you very much for any insights :)

ashok-arjun commented 6 months ago

Hi, thanks for the detailed comment.

Note that the fine-tuning scripts are not out yet (apologies for the delay: I was on vacation and just got back; I'm planning to release them in 2 weeks, I hope that's OK). I can help debug and obtain better predictions once the fine-tuning script is out.

Regarding your dataset, could you please explain what you mean by "the time series are not following each other"? Do you mean they are irregularly spaced?

Since you have very little data for your task, I think the foundation model should perform better than a from-scratch model, though this could differ depending on the data. In the meantime, I suggest obtaining metrics with a from-scratch model of your choice as a baseline.
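One cheap baseline to get such metrics before any model training is a naive last-value forecast. The sketch below is an illustrative assumption (random data, hypothetical sizes), not part of the Lag-Llama codebase:

```python
import numpy as np

# Hypothetical data: 10K series of 100 points each (randomly generated).
rng = np.random.default_rng(0)
data = rng.uniform(-100, 100, size=(10_000, 100))
context, target = data[:, :80], data[:, 80:]

# Naive baseline: repeat each series' last observed value over the
# 20-step forecast horizon.
naive_forecast = np.repeat(context[:, -1:], target.shape[1], axis=1)

# Mean absolute error of the naive forecast, averaged over all series.
mae = np.abs(naive_forecast - target).mean()
print(f"naive last-value MAE: {mae:.2f}")
```

Any fine-tuned (or from-scratch) model would need to beat this number to justify its complexity; probabilistic forecasters like Lag-Llama are usually also scored with CRPS rather than MAE alone.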

ashok-arjun commented 5 months ago

Hi @scarbain, is this resolved?

scarbain commented 5 months ago

Hi @ashok-arjun, unfortunately I haven't had time to get back to this. I'll let you know if I do :)