-
When I run `forecasts, tss = get_lag_llama_predictions(backtest_dataset, prediction_length, device, num_samples)` with my own data, I get an invalid frequency error. Could you please help?
Error:
…
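A hedged sketch of what usually triggers this class of error (not a confirmed diagnosis of this report): GluonTS-based loaders expect a valid pandas frequency alias (e.g. `"D"`, `"15min"`), and an index with no fixed frequency cannot be mapped to one. Checking the index with `pd.infer_freq` before building the dataset is a quick sanity test:

```python
import pandas as pd

# A regular daily index infers to the valid pandas alias "D".
regular = pd.date_range("2024-01-01", periods=8, freq="D")
print(pd.infer_freq(regular))  # "D"

# An index with gaps has no fixed frequency, so nothing can be inferred;
# passing such data downstream is a common source of frequency errors.
irregular = pd.DatetimeIndex(["2024-01-01", "2024-01-02", "2024-01-05"])
print(pd.infer_freq(irregular))  # None
```

If `infer_freq` returns `None`, resampling the series to a fixed grid (e.g. `df.resample("D").mean()`) before constructing the dataset is one way to proceed.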
-
Hi lag-llama team,
Thank you for developing this project; it's a highly valuable resource.
I wasn't able to find the lag-llama package on PyPI, which makes it a bit challenging to integrate into …
-
### Description
Amazon Chronos-T5: https://huggingface.co/amazon/chronos-t5-large
IBM Granite: https://huggingface.co/ibm-granite/granite-timeseries-ttm-v1
Lag-Llama: https://huggingface.co/time-…
-
Hello Lag Llama team,
I would like to know if Lag Llama models take dates into account when fine-tuning the model and making predictions.
To clarify, I am asking this because I need to understand…
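As a generic illustration only (this is not a statement about lag-llama's internals): time-series libraries commonly derive calendar covariates from the timestamp index, which is the usual mechanism by which dates influence a model:

```python
import pandas as pd

# Illustrative calendar features computed from a timestamp index.
# 2024-01-01 is a Monday, and pandas encodes Monday as 0.
idx = pd.date_range("2024-01-01", periods=4, freq="D")
features = pd.DataFrame(
    {"day_of_week": idx.dayofweek, "month": idx.month}, index=idx
)
print(features["day_of_week"].tolist())  # [0, 1, 2, 3]
```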
-
FileNotFoundError: [Errno 2] No such file or directory: 'datasets\\australian_electricity_demand\\metadata.json'
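A minimal sketch (paths here mirror the error message and are otherwise assumptions): checking for the dataset files up front turns this deep `FileNotFoundError` into an actionable message, since the usual cause is that the dataset was never downloaded or generated at the expected location:

```python
from pathlib import Path

# Path taken from the error message above; adjust to your working directory.
metadata = Path("datasets") / "australian_electricity_demand" / "metadata.json"

if not metadata.exists():
    # The loader expects metadata.json to exist; download or generate the
    # dataset first, or point the loader at the directory that contains it.
    print(f"Dataset not found at {metadata.parent}; fetch it before loading.")
```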
-
Hello, it's Arthur,
Thank you for your great work.
I would like to ask whether it is possible to get the specific lag indices you use during the pre-training or zero-shot phases.
In the Colab tut…
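For context, a hedged sketch of what "lag indices" mean mechanically (the indices below are illustrative examples, not lag-llama's actual set): each lag `l` contributes the past value `x[t - l]` as an input feature at time `t`:

```python
import numpy as np

# Toy series and illustrative lag indices (NOT the model's real lag set).
series = np.arange(10.0)
lags = [1, 2, 7]
t = 9

# For each lag l, the feature at time t is the value l steps in the past.
lag_features = [series[t - lag] for lag in lags]
print(lag_features)  # [8.0, 7.0, 2.0]
```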
-
Hi Lag-Llama team,
I would like to know whether I can use this model to forecast future customer trends based on feedback data from the past three months.
To clarify, I have customer feedback data …
-
I reported in another issue that the most recent `pytorch-lightning` does not work with `lag-llama`. I also tried a few version combinations among pytorch, pytorch-lightning, and gluonts. Eventually I…
-
During fine-tuning, checkpoints are saved automatically; the best one is reported on the console, e.g.
`\\lag-llama-main\\lightning_logs\\version_12\\checkpoints\\epoch=34-step=5678.ckpt`
…
-
Why is the input repeated here? The distribution parameters produced after the repeat should all be identical, right? What is the purpose of this?
https://github.com/time-series-foundation-…
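A hedged sketch of the usual rationale for such a repeat in probabilistic forecasters (whether it is the reason in this exact code path would need the maintainers to confirm): the distribution parameters are indeed identical across the repeated rows, but each repeated row yields an independent random draw, so one batched pass produces `num_samples` sample paths:

```python
import numpy as np

rng = np.random.default_rng(0)

# Identical parameters repeated num_samples times (mu, sigma are toy values).
mu, sigma, num_samples = 2.0, 0.5, 4
params = np.tile([mu, sigma], (num_samples, 1))

# Same parameters in every row, yet each row gives an independent sample,
# which is what repeating buys: many sample paths from one batched draw.
draws = rng.normal(params[:, 0], params[:, 1])
print(draws.shape)  # (4,)
```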