time-series-foundation-models / lag-llama

Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
Apache License 2.0

Negative training loss #40

Closed MCamorphous closed 2 months ago

MCamorphous commented 2 months ago

Is it normal to have negative training losses? [screenshot of training loss curve]

ashok-arjun commented 2 months ago

Hi! Yes, as per our implementation, we minimize the negative log likelihood (NLL), and we call this the "loss". Since the predictive distributions are continuous, the density at an observation can exceed 1, which makes the log likelihood positive and the NLL negative. That's why you (only) get negative training/validation losses.
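As a minimal illustration (not Lag-Llama's actual loss code), here is the NLL of a single observation under a Gaussian predictive distribution: when the distribution is sharp (small sigma), the density exceeds 1 and the NLL goes negative.

```python
import math

def gaussian_nll(x, mu, sigma):
    # Negative log likelihood of one observation under N(mu, sigma^2).
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

# Sharp predictive distribution: density at x can exceed 1,
# so log likelihood > 0 and NLL < 0.
print(gaussian_nll(0.0, mu=0.0, sigma=0.1))  # negative

# Wide predictive distribution: density < 1, NLL > 0.
print(gaussian_nll(0.0, mu=0.0, sigma=2.0))  # positive
```

So a decreasing, eventually negative loss curve simply means the model is assigning increasingly high density to the observed values.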

Hope this clarifies! Feel free to follow up if there are more questions!