Closed elephaint closed 5 months ago
TiDE is missing from the automatic evaluation in test-model-performance.
It should be added in the action_files
folder, concretely: https://github.com/Nixtla/neuralforecast/blob/main/action_files/test_models/src/models.py and https://github.com/Nixtla/neuralforecast/blob/main/action_files/test_models/src/evaluation.py
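Conceptually, adding a model to those scripts means registering one more entry in a model list and letting the shared evaluation loop run it. The sketch below is a hypothetical stand-in for that pattern, not the actual contents of models.py or evaluation.py; the naive "models" and the registry names are illustrative assumptions.

```python
# Hypothetical sketch of an evaluation registry in the spirit of
# action_files/test_models/src: every registered model forecasts the same
# held-out window and its error is recorded. The entries below are naive
# placeholders, not the real neuralforecast classes.

def naive_last(train, horizon):
    # Repeat the last observed value (placeholder for a real model's forecast).
    return [train[-1]] * horizon

def naive_mean(train, horizon):
    # Forecast the historical mean (another placeholder model).
    m = sum(train) / len(train)
    return [m] * horizon

# Adding TiDE would amount to one more entry in a registry like this one.
REGISTRY = {"NaiveLast": naive_last, "NaiveMean": naive_mean}

def mae(y_true, y_pred):
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)

def evaluate_all(series, horizon):
    """Split off the last `horizon` points and score every registered model."""
    train, test = series[:-horizon], series[-horizon:]
    return {name: mae(test, model(train, horizon))
            for name, model in REGISTRY.items()}

if __name__ == "__main__":
    series = [float(i % 7) for i in range(100)]
    print(evaluate_all(series, horizon=7))
```

Because every model goes through the same loop, a newly added model (such as TiDE here) is automatically scored on each CI run, which is the point of the request above.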
@cchallu A lot of models are missing from those scripts, and most models seem to be unused in models.py. Shouldn't all the models be included in the model lists in models.py and evaluation.py? (If so, I will make that change too.)
Sorry, I forgot to mention. The idea is to have all univariate models in this evaluation, and we should include them at the moment of creating the model to assess their performance and detect potential bugs.
We should then have a separate script for multivariate models on a different dataset tailored for multivariate forecasting, like Ettm2.
OK, I added BiTCN and TiDE to the script, so it should be fine for this PR.
Adds TiDE model.
Note: `scaler_type='revin'`.
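For context, `scaler_type='revin'` refers to reversible instance normalization: each input window is normalized by its own mean and standard deviation before the model sees it, and the saved statistics are used to map the forecasts back to the original scale. The following is a minimal sketch of that idea, not neuralforecast's actual implementation:

```python
import math

def revin_normalize(window, eps=1e-5):
    """Normalize one window by its own mean/std; return stats so we can invert."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    std = math.sqrt(var + eps)  # eps guards against zero variance
    normed = [(x - mean) / std for x in window]
    return normed, (mean, std)

def revin_denormalize(values, stats):
    """Invert the normalization, e.g. to map model outputs back to scale."""
    mean, std = stats
    return [v * std + mean for v in values]

window = [10.0, 12.0, 14.0, 16.0]
normed, stats = revin_normalize(window)
restored = revin_denormalize(normed, stats)
print([round(x, 6) for x in restored])  # recovers the original window
```

Per-window statistics make the model robust to level shifts between series, which is presumably why this scaler was chosen for TiDE here.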