Closed arwooy closed 4 months ago
Hello.
Your statement is correct. This part of the code serves our experiment on the generalization of the iTransformer model: we train on a fixed 20% of the variables and run inference on all variables. The experiment showed only a very small performance degradation in this setting, demonstrating that our model generalizes well across variates.
Our paper also includes a separate efficient-training experiment, in which we randomly select 20% of the variables at each training step.
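To make the distinction between the two setups concrete, here is a minimal sketch (not the repository's actual code) of both selection strategies. The helper name `select_partial_variates` is hypothetical; `partial_start`/`partial_end` follow the snippet quoted below, and NumPy stands in for the PyTorch tensors used in the repo:

```python
import numpy as np

def select_partial_variates(batch_x, batch_y, ratio=0.2, random_sample=False, rng=None):
    """Keep a subset of variates (channels) along the last dimension.

    batch_x / batch_y have shape [batch, time, n_vars].
    Hypothetical illustration of the two training setups described above.
    """
    n_vars = batch_x.shape[-1]
    k = max(1, int(n_vars * ratio))
    if random_sample:
        # Efficient-training variant: a fresh random 20% of variates each step.
        rng = rng or np.random.default_rng()
        idx = rng.permutation(n_vars)[:k]
        return batch_x[:, :, idx], batch_y[:, :, idx]
    # Generalization variant: the same fixed 20% slice throughout training;
    # inference is then run on all n_vars variates.
    partial_start, partial_end = 0, k
    return (batch_x[:, :, partial_start:partial_end],
            batch_y[:, :, partial_start:partial_end])

x = np.random.randn(8, 96, 10)  # 10 variates in total
y = np.random.randn(8, 48, 10)
px, py = select_partial_variates(x, y)
print(px.shape, py.shape)  # only 2 of the 10 variates remain
```

In the generalization setup the model therefore only ever sees the fixed slice during training, which is exactly what the question below observes; the point of the experiment is that this restricted training still transfers to the unseen variates at inference time.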
The following code snippet is from `exp_long_term_forecasting_partial.py`:
Variate Generalization training:
Are the following two lines redundant? `batch_x = batch_x[:, :, partial_start:partial_end]` and `batch_y = batch_y[:, :, partial_start:partial_end]`
If only `partial_start:partial_end` is selected every time, doesn't that mean only those variables are ever trained?