https://github.com/xinzzzhou/ScalableTransformer4HighDimensionMTSF/blob/bb4fe13adf8a141f748145200091212b09abda42/data_provider/data_loader.py#L220C1-L221C1
Hi, I found a bug in the Dataset code. The intention of this line is to keep the 0th dimension (sequence length) unchanged and swap the 1st dimension (k+1: the target plus its top-K channels) with the 2nd dimension (D: the number of channels). However, `reshape` does not actually swap the two dimensions; it only reinterprets the flat element order, which leads to issues with `seq_x` and `seq_y` in the final Dataset. When I replaced `reshape` with `swapaxes` in my experiments, the results got even worse. Could you help me understand why this happens?
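To illustrate the difference I mean, here is a minimal NumPy sketch (the shapes `L`, `K1`, `D` are made-up placeholders for sequence length, k+1, and channel count, not the repo's actual values). `reshape` preserves the flat row-major element order and only relabels the axes, while `swapaxes` genuinely permutes the data:

```python
import numpy as np

# Hypothetical shapes for illustration: L = sequence length,
# K1 = k+1 (target plus top-K channels), D = number of channels.
L, K1, D = 2, 3, 4
x = np.arange(L * K1 * D).reshape(L, K1, D)

# reshape keeps the flat (row-major) element order and merely
# relabels the axes, so no element actually moves:
r = x.reshape(L, D, K1)
assert np.array_equal(r.ravel(), x.ravel())

# swapaxes permutes the data: element [l, i, j] moves to [l, j, i],
# i.e. each length-L slice is transposed.
s = x.swapaxes(1, 2)
assert s.shape == (L, D, K1)
assert np.array_equal(s[0], x[0].T)

# The two results have the same shape but different contents:
assert not np.array_equal(r, s)
```

So the two calls produce tensors of the same shape but with elements in different positions, which is why swapping one for the other changes what the model is trained on.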