ETTAN93 opened this issue 1 month ago
As implemented in Darts, the TiDE model uses `input_chunk_length` for both `lags` and `lags_past_covariates` (as defined in the regression models), and correspondingly uses `output_chunk_length` for `lags_future_covariates`. Nearly all of the Torch forecasting models work this way, so if you need finer-grained lag control you will need to write your own model.
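In concrete terms, that mapping can be sketched as follows. This is a hypothetical helper for illustration only (not part of the Darts API), assuming the regression models' convention of negative lags for past steps and 0-based steps into the horizon for future covariates:

```python
def chunk_lengths_to_lags(input_chunk_length, output_chunk_length):
    """Sketch: the regression-style lags implied by TiDE-style chunk lengths.

    Hypothetical helper, not part of Darts.
    """
    # Past targets and past covariates both span the full lookback window.
    lags = list(range(-input_chunk_length, 0))
    lags_past_covariates = list(range(-input_chunk_length, 0))
    # Future covariates span the forecast horizon, steps 0 .. horizon-1.
    lags_future_covariates = list(range(output_chunk_length))
    return lags, lags_past_covariates, lags_future_covariates

lags, past, future = chunk_lengths_to_lags(4, 3)
print(lags)    # [-4, -3, -2, -1]
print(future)  # [0, 1, 2]
```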
Thanks. Is there a specific reason why it was implemented that way? Or is it just how the model architecture works?
I'm not a developer of the library myself, but it is near-ubiquitous for neural network-based forecasting models to define a single lookback window for past target values and past covariates, and a single horizon for targets and future covariates. This is how TiDE was originally described, and it is how all of the Torch-based forecasting models in Darts are implemented. Besides its simplicity, it allows the assumption that future covariates can be aligned with the output time dimension and past covariates with the past target values.
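That alignment assumption can be shown with integer time indices (a minimal sketch; the function and variable names are mine, not Darts'):

```python
def aligned_slices(t0, input_chunk_length, output_chunk_length):
    """For a forecast starting at t0, return the two shared windows:
    one lookback for past targets + past covariates, one horizon for
    forecasts + future covariates. Illustrative sketch only."""
    lookback = list(range(t0 - input_chunk_length, t0))
    horizon = list(range(t0, t0 + output_chunk_length))
    return lookback, horizon

lb, hz = aligned_slices(10, 4, 3)
print(lb)  # [6, 7, 8, 9]   -- past targets and past covariates
print(hz)  # [10, 11, 12]   -- forecasts and future covariates
```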
That being said, you could definitely adapt TiDE to meet your needs, since it does not really depend on these assumptions. Implementing the changes may not be trivial, though: you would need to add a new dataset type and a corresponding `torch_forecasting_model` type, as well as the model itself. The model architecture changes should be straightforward, but I'm not sure how challenging the rest would be, as I am not familiar with that part of the project.
If I initially had an LGBM model with the covariates defined as below:
How would I define a similar setup if I want to test out the TiDE model?
What would the `input_chunk_length` and `output_chunk_length` be in this case to take into account the lags for both the past and future covariates? I assume the `output_chunk_length` should still be 3?
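Following the correspondence described earlier in the thread, the chunk lengths can be derived from regression-style lags roughly like this. Since the original LGBM configuration is not shown, the lag values below are hypothetical, and the helper is a sketch, not a Darts API:

```python
def lags_to_chunk_lengths(lags, lags_past_covariates, lags_future_covariates):
    """Sketch: derive TiDE-style chunk lengths from regression-model lags.

    Assumes past lags are negative integers and future-covariate lags are
    0-based steps into the horizon. Hypothetical helper, not part of Darts.
    """
    # The lookback must reach back to the oldest past lag of either input.
    input_chunk_length = max(-min(lags), -min(lags_past_covariates))
    # The horizon must cover the furthest future-covariate step.
    output_chunk_length = max(lags_future_covariates) + 1
    return input_chunk_length, output_chunk_length

# Hypothetical example: target lags -12..-1, past-covariate lags -6..-1,
# future-covariate steps [0, 1, 2].
print(lags_to_chunk_lengths(list(range(-12, 0)), list(range(-6, 0)), [0, 1, 2]))
# (12, 3)
```

Under these assumptions, an `output_chunk_length` of 3 would indeed cover future-covariate steps 0 through 2.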