
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting

Repeat inputs during prediction #45

Open sudongwang-upc opened 7 months ago

sudongwang-upc commented 7 months ago

Why is the input repeated here? The outputs generated after the repeat (i.e. the parameters of the distribution) should all be the same, right? What is the purpose of this?

https://github.com/time-series-foundation-models/lag-llama/blob/9965e30faff6087992a13717ec2f3c2cc3967fd5/lag_llama/gluon/lightning_module.py#L221-L226

ashok-arjun commented 7 months ago

When sampling, we need to generate self.model.num_parallel_samples samples at every sampling iteration. Hence, we repeat all the tensors that many times.
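
As a rough illustration of this tiling, here is a minimal sketch assuming PyTorch tensors (the variable names and shapes are invented; this is not the actual Lag-Llama code):

```python
import torch

# Suppose we want num_parallel_samples sample paths per series in the batch.
num_parallel_samples = 100
past_target = torch.randn(8, 64)  # dummy (batch, context_length) conditioning tensor

# Tile along the batch dimension so that every original series appears
# num_parallel_samples times; one sample is then drawn per repeated row.
repeated_past_target = past_target.repeat_interleave(
    repeats=num_parallel_samples, dim=0
)  # shape: (8 * 100, 64)
```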

sudongwang-upc commented 7 months ago

> When sampling, we need to generate self.model.num_parallel_samples samples at every sampling iteration. Hence, we repeat all the tensors that many times.

Thank you for your response to this issue. Let me elaborate on my question again.

  1. The model output obtained from the same input should be the same.
  2. In Lag-Llama, the distribution parameters obtained from multiple identical (repeated) inputs should therefore all be the same, right?
  3. Sampling n times from a single distribution, versus sampling once from each of n distributions with identical parameters, should give consistent results, right?
  4. So could the same effect be achieved by sampling self.model.num_parallel_samples times from the distribution produced by the model for a single, non-repeated input, instead of repeating the input?

ashok-arjun commented 7 months ago

Thanks for elaborating.

  1. Yes
  2. No, we sample from the distribution, which is a non-deterministic operation even given the same inputs. By drawing, say, 100 samples from the distribution for the same input, we cover many modes of the distribution, which is what we want.
  3. No, please see the answer above
  4. The repetitions are done to be compatible with the Student's t distribution class. You could very well write a Distribution class where the repetitions are done internally during sampling, but we resorted to the library function, for which we had to repeat the inputs manually (see the sketch below).

Feel free to follow up! :)
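
To make the compatibility point concrete, here is a minimal sketch using torch.distributions.StudentT (an illustration only, not the actual Lag-Llama code; the parameter values are placeholders). Drawing n samples from one distribution and drawing one sample from each of n parameter-repeated copies are both n i.i.d. draws from the same distribution; the repeated form simply matches a batched sample() interface.

```python
import torch
from torch.distributions import StudentT

# Option A: one distribution, sampled num_parallel_samples times.
df, loc, scale = torch.tensor(3.0), torch.tensor(0.0), torch.tensor(1.0)
samples_a = StudentT(df, loc, scale).sample((100,))  # shape: (100,)

# Option B: repeat the parameters (as if the inputs had been repeated),
# build a batched distribution, and draw a single sample per copy.
dist_batched = StudentT(df.repeat(100), loc.repeat(100), scale.repeat(100))
samples_b = dist_batched.sample()  # shape: (100,)

# Statistically, both give 100 i.i.d. draws from the same Student's t
# distribution; option B simply fits a batched sampling interface.
```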

SpeeeedLee commented 5 months ago

@ashok-arjun May I ask whether there is a quick way to extract the mean and std of the t-distribution that the model predicts?
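
One hedged sketch, assuming the predicted distribution is exposed as a torch.distributions.StudentT object (the thread does not show how to obtain it from Lag-Llama, so this is only an assumption): its moments can be read off directly.

```python
import torch
from torch.distributions import StudentT

# Sketch only: assumes a StudentT distribution object with the predicted
# parameters is available (df, loc, scale below are placeholder values).
dist = StudentT(df=torch.tensor(5.0), loc=torch.tensor(0.2), scale=torch.tensor(1.3))

mean = dist.mean     # equals loc (defined for df > 1)
std = dist.stddev    # equals scale * sqrt(df / (df - 2)) (defined for df > 2)
print(mean.item(), std.item())
```

Alternatively, the mean and standard deviation can be estimated empirically from the forecast samples that prediction already produces.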