yuqinie98 / PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023). https://arxiv.org/abs/2211.14730
Apache License 2.0

Multi-GPU #24

Closed: ikvision closed this issue 1 year ago

ikvision commented 1 year ago

Is there an option to run training on multiple GPUs (single node)? I would like to make training faster via an effectively larger batch size.

yuqinie98 commented 1 year ago

Yes! You can find `--use_multi_gpu` and `--devices` in run_longExp.py. You just need to set those parameters for multi-GPU usage.
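
For reference, an invocation might look like the sketch below. Only `--use_multi_gpu` and `--devices` are the multi-GPU switches named above; the remaining flags are illustrative placeholders for a typical run, not a verified command:

```bash
# Illustrative only: dataset/model flags are placeholders for a typical run.
# The multi-GPU switches are --use_multi_gpu and --devices.
python run_longExp.py \
    --is_training 1 \
    --model PatchTST \
    --data ETTh1 \
    --use_multi_gpu \
    --devices 0,1,2,3
```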

ikvision commented 1 year ago

@yuqinie98 Thanks, multi-GPU works perfectly for me now.

dawn0713 commented 1 year ago

How can I use multi-GPU in patchtst_pretrain.py? Thanks!
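
(Note for readers landing here: the thread does not say whether patchtst_pretrain.py exposes a `--use_multi_gpu` flag. One generic PyTorch approach, an assumption rather than the repo's documented API, is to wrap the model in `torch.nn.DataParallel` before training:)

```python
# A generic PyTorch sketch (an assumption, not the repo's documented API):
# wrap the model in nn.DataParallel so each batch is split across GPUs.
import torch
import torch.nn as nn

# Placeholder module standing in for the PatchTST model that
# patchtst_pretrain.py builds internally.
model = nn.Linear(512, 512)

if torch.cuda.device_count() > 1:
    # Replicate the model on every visible GPU; inputs are scattered
    # along dim 0 and outputs gathered back on the default device.
    model = nn.DataParallel(model)

model = model.to(torch.device("cuda" if torch.cuda.is_available() else "cpu"))
```

For larger jobs, `torch.nn.parallel.DistributedDataParallel` generally scales better, though it requires a distributed launcher rather than a one-line wrap.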