-
Thank you very much for the quick implementation.
Looking at the [original paper](https://arxiv.org/abs/2310.06625) and the [author implementation](https://github.com/lucidrains/iTransformer) on iTra…
-
I have a problem running an object detection model in CPU mode:
var mlContext = new MLContext();
mlContext.GpuDeviceId = null;
mlContext.FallbackToCpu = true;
…
-
bash ./scripts/multivariate_forecast/Traffic/iTransformer.sh
-
Suppose you have a base `ITransformer` class with a subclass for each type:
`public abstract class BaseTransformer implements ITransformer`
├─ `public class ClassTransformer extends BaseTransforme…
-
This is very nice work. Running the demo dataset you provided is straightforward, but I don't know how to run a custom dataset (such as household energy consumption data, where e…
-
Thanks for sharing. This is an awesome work!
In my opinion, compared to existing work, iTransformer essentially uses an MLP to model the temporal dependencies and then uses attention scores to analy…
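The inverted design described above can be sketched as a small PyTorch block (hypothetical names, not the repo's actual code): each variate's entire lookback series is embedded as one token, attention then mixes information across variates, and a feed-forward MLP handles the temporal content inside each token.

```python
import torch
import torch.nn as nn

class InvertedBlock(nn.Module):
    """Minimal sketch of the inverted idea: attention over variate tokens,
    MLP over each token's temporal embedding. Illustrative only."""
    def __init__(self, seq_len: int, d_model: int, n_heads: int = 4):
        super().__init__()
        # Each variate's whole length-seq_len series becomes ONE token.
        self.embed = nn.Linear(seq_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 2 * d_model), nn.GELU(), nn.Linear(2 * d_model, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):                          # x: (batch, seq_len, n_variates)
        tokens = self.embed(x.transpose(1, 2))     # (batch, n_variates, d_model)
        a, _ = self.attn(tokens, tokens, tokens)   # attention ACROSS variates
        tokens = self.norm1(tokens + a)
        tokens = self.norm2(tokens + self.ffn(tokens))  # MLP mixes temporal features
        return tokens

x = torch.randn(8, 96, 7)                          # 7 variates, lookback 96
out = InvertedBlock(seq_len=96, d_model=64)(x)
print(out.shape)                                   # torch.Size([8, 7, 64])
```

Note how the attention's sequence axis is the variate axis, so the attention map can be read as variate-to-variate correlations.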
-
Hi, thanks a lot for your great code and effort!
Would you consider adding a link to the official implementation, https://github.com/thuml/iTransformer?
-
Can Koopa's performance on the long-term forecasting task be added to the leaderboard? The ranking in the README has not been updated — does that mean Koopa performs worse than TimesNet and iTransformer? That would contradict the results in the original Koopa paper.
-
Hi, I found that the performance of PatchTST on the **traffic** dataset is significantly different from the original paper. In the original paper, the MSE results of PatchTST on the traffic d…
-
It seems that the `nn.Conv1d` layers in the `TokenEmbedding` perform convolutions across the time axis, which leaks future information.
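A quick way to verify the concern: with `kernel_size=3` and symmetric padding, the output at time t depends on the input at t+1, so perturbing a single future timestep changes a past output. A left-padded (causal) convolution does not. This is a minimal sketch of the padding behaviour, not the repo's actual `TokenEmbedding`; the weights are fixed by hand so the demo is deterministic:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Symmetric padding: output[t] = w0*x[t-1] + w1*x[t] + w2*x[t+1],
# i.e. the convolution reads one step into the future.
conv = nn.Conv1d(1, 1, kernel_size=3, padding=1, bias=False)
conv.weight.data = torch.tensor([[[0.2, 0.5, 0.3]]])  # fixed weights for the demo

x = torch.randn(1, 1, 10)
x2 = x.clone()
x2[0, 0, -1] += 1.0                        # perturb ONLY the last timestep

with torch.no_grad():
    y, y2 = conv(x), conv(x2)
leaky = not torch.allclose(y[0, 0, 8], y2[0, 0, 8])
print(leaky)  # True: output at t=8 changed although only t=9 was modified

# Causal alternative: pad on the left only, so output[t] sees x[t-2..t].
causal = nn.Conv1d(1, 1, kernel_size=3, padding=0, bias=False)
causal.weight.data = conv.weight.data.clone()

with torch.no_grad():
    z = causal(F.pad(x, (2, 0)))
    z2 = causal(F.pad(x2, (2, 0)))
print(torch.allclose(z[0, 0, :9], z2[0, 0, :9]))  # True: past outputs unchanged
```

Whether this matters in practice depends on the task: it is harmless if the whole lookback window is observed history, but it is a real leak if the embedded window overlaps the forecast horizon.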