Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting

context_length fine-tuning #41

Closed: arthur-b1 closed this issue 2 months ago

arthur-b1 commented 2 months ago

Hi, thanks for your great work. I have a question about the suggested values for context_length, which seem to be powers of 2. Is there a specific reason for this choice, or could we use other values for context_length that are not powers of 2?

ashok-arjun commented 2 months ago

Hi! Thanks for the kind words.

We use powers of 2 just because that's the convention. You could use any value.
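For reference, a minimal sketch of fine-tuning with a non-power-of-two `context_length`, following the pattern from the repo's demo notebooks. The checkpoint path, `prediction_length`, epoch count, and `train_dataset` here are placeholders, and the exact constructor arguments may differ across versions:

```python
import torch
from lag_llama.gluon.estimator import LagLlamaEstimator

# Load the pretrained checkpoint to reuse its architecture hyperparameters.
ckpt = torch.load("lag-llama.ckpt", map_location="cpu")  # placeholder path
model_kwargs = ckpt["hyper_parameters"]["model_kwargs"]

estimator = LagLlamaEstimator(
    ckpt_path="lag-llama.ckpt",
    prediction_length=24,        # placeholder horizon
    context_length=100,          # not a power of 2; works fine
    # Architecture arguments copied from the pretrained checkpoint.
    input_size=model_kwargs["input_size"],
    n_layer=model_kwargs["n_layer"],
    n_embd_per_head=model_kwargs["n_embd_per_head"],
    n_head=model_kwargs["n_head"],
    time_feat=model_kwargs["time_feat"],
    batch_size=64,
    trainer_kwargs={"max_epochs": 50},
)

# `train_dataset` is a GluonTS-style training dataset of your own
# series (placeholder); `train` returns a fitted predictor.
predictor = estimator.train(train_dataset)
```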

arthur-b1 commented 2 months ago

I see, thanks!