thuml / Autoformer

About Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
MIT License

Training wall clock time #180

Closed norikazu99 closed 1 year ago

norikazu99 commented 1 year ago

Hello, thanks a lot for the repo and the collection of models; they have been very helpful. However, when I train one of the models (the Informer, specifically) with a PyTorch Lightning trainer, one epoch takes around 30 minutes even with mixed precision and a reduced model size, whereas I usually train other models for 20 epochs in about 5 to 30 minutes, depending on size. Do you have any insights into how I could modify the model or otherwise speed up training? Is the slowdown due to the factor parameter? Once again, thanks for your help.
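For context on the `factor` question: in the Informer paper, ProbSparse self-attention scores only the top u = factor · ln(L) "active" queries against all keys, so `factor` controls the sparsity of attention rather than being the main wall-clock bottleneck by itself. A rough back-of-the-envelope sketch (plain Python, not the repo's code; the operation counts are an assumption-level approximation, ignoring heads, batch, and the sampling step) of how the score-computation cost compares:

```python
import math

def attention_ops(seq_len: int, factor: int = 5) -> dict:
    """Approximate number of query-key score computations."""
    # Full self-attention scores every query against every key: O(L^2).
    full = seq_len * seq_len
    # ProbSparse attention (Informer) keeps only u = factor * ln(L) active
    # queries, giving roughly O(L log L) score computations.
    u = min(seq_len, factor * math.ceil(math.log(seq_len)))
    sparse = u * seq_len
    return {"full": full, "prob_sparse": sparse, "ratio": full / sparse}

for L in (96, 336, 720):
    print(L, attention_ops(L))
```

Under this approximation, raising `factor` increases the number of active queries linearly, so a very large `factor` erodes the savings over full attention; a small `factor` keeps the O(L log L) behavior but may hurt accuracy.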