Closed MCamorphous closed 2 months ago
Is it normal to have negative training losses?

![loss](https://github.com/time-series-foundation-models/lag-llama/assets/130282119/9ab45426-4ed5-4b28-8f52-2be6788bde6b)

Hi! Yes. As per our implementation, we minimize the negative log likelihood, and we call this the "loss". Since the predictive density at a target value can exceed 1, its log is positive, so the negative log likelihood is negative. That's why you can get negative training/validation losses.
Hope this clarifies! Feel free to follow up if there are more questions!
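To see why a negative-log-likelihood loss can go below zero, here is a minimal sketch (not lag-llama's actual code, and using a Gaussian purely for illustration): when the model's predictive distribution is sharply concentrated around the target, the density there exceeds 1, and the NLL becomes negative.

```python
import math

def gaussian_nll(x, mu, sigma):
    """Negative log likelihood of x under a Normal(mu, sigma^2) density."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

# A sharp (low-variance) prediction: density at the target > 1, so NLL < 0.
print(gaussian_nll(0.0, mu=0.0, sigma=0.1))   # negative

# A diffuse (high-variance) prediction: density < 1, so NLL > 0.
print(gaussian_nll(0.0, mu=0.0, sigma=5.0))   # positive
```

So a decreasing, even negative, loss curve just means the model is assigning higher and higher density to the observed targets; nothing is wrong with the training.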