dingdongwang opened this issue 9 months ago
Hi, I have a question about LTU-AS multi-GPU training. May I kindly ask whether this repo supports training on multiple GPUs? I didn't see any related configuration (e.g., accelerate, deepspeed).
Thank you, and I look forward to your reply!
All of the code uses multiple GPUs by default; Hugging Face handles that.
https://github.com/YuanGongND/ltu/blob/4589490e23f4fc5cb970b22a98a123688bbaa419/src/ltu_as/train_scripts/finetune_toy.sh#L18
https://github.com/YuanGongND/ltu/blob/4589490e23f4fc5cb970b22a98a123688bbaa419/src/ltu_as/finetune.py#L127
https://github.com/YuanGongND/ltu/blob/4589490e23f4fc5cb970b22a98a123688bbaa419/src/ltu_as/finetune.py#L107-L110
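For reference, the linked lines follow the usual Hugging Face / torchrun pattern: the script checks the environment variables that torchrun sets and switches between single-process and DDP placement. Below is a minimal sketch of that detection, assuming the common alpaca-lora-style variable names; it is illustrative, not a verbatim copy of finetune.py.

```python
# Sketch of how HF-style training scripts typically detect multi-GPU runs.
# WORLD_SIZE and LOCAL_RANK are set automatically by torchrun.
import os

world_size = int(os.environ.get("WORLD_SIZE", 1))  # number of launched processes
ddp = world_size != 1                              # True when launched on >1 GPU

if ddp:
    # each DDP process pins the model to its own GPU
    device_map = {"": int(os.environ.get("LOCAL_RANK", 0))}
else:
    # single process: let Hugging Face place the model automatically
    device_map = "auto"

print(f"world_size={world_size}, ddp={ddp}, device_map={device_map}")

# Typical launch command (assumed, adjust GPU count and script arguments):
#   torchrun --nproc_per_node=4 finetune.py ...
```

So no extra accelerate/deepspeed configuration is needed; launching the training script across several processes (e.g., via torchrun) is enough for the Trainer to run data-parallel training.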
-Yuan