anhnh2002 / XTTSv2-Finetuning-for-New-Languages

60 stars 17 forks

multigpu training #11

Open thivux opened 1 month ago

thivux commented 1 month ago

thanks a bunch for sharing this awesome work – it's super helpful! 😄

quick question: is it possible to use the training scripts for dvae and gpt with a multi-gpu setup?

keep up the great work, and thanks for being so responsive to everyone's questions! 🙏

anhnh2002 commented 1 month ago

> thanks a bunch for sharing this awesome work – it's super helpful! 😄
>
> quick question: is it possible to use the training scripts for dvae and gpt with a multi-gpu setup?
>
> keep up the great work, and thanks for being so responsive to everyone's questions! 🙏

Sorry for the delayed reply. Currently, the training scripts are designed for single-GPU use. However, I'm considering adding multi-GPU support in a future update.

TugdualKerjan commented 4 weeks ago

Using `python -m trainer.distribute --gpus 0,1,2,... --script train_xtts_gpt.py` (with however many GPUs you have) worked for me. Beware that you have to deactivate the `HfArgumentParser` in `train_xtts_gpt.py` and instead pass the arguments as hardcoded variables in the script.
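To illustrate, the workaround might look roughly like the sketch below. The dataclass name and fields here are hypothetical placeholders, not the actual ones in `train_xtts_gpt.py`; the point is only the pattern of replacing the CLI parser with hardcoded values:

```python
from dataclasses import dataclass

# Hypothetical stand-in for the argument dataclass that the training script
# would normally populate via HfArgumentParser; field names are illustrative.
@dataclass
class GPTTrainingArgs:
    output_path: str = "checkpoints/"
    batch_size: int = 8
    num_epochs: int = 10

# Before (single GPU): arguments come from the command line, e.g.
#   parser = HfArgumentParser(GPTTrainingArgs)
#   (args,) = parser.parse_args_into_dataclasses()
# Under `python -m trainer.distribute`, the launcher adds its own CLI flags
# (--gpus, --script, ...), which a strict argument parser would reject.

# After (multi-GPU): bypass the parser and hardcode the values directly.
args = GPTTrainingArgs(output_path="checkpoints/", batch_size=8, num_epochs=10)
```

With the parser removed, the script no longer reads anything from `sys.argv`, so the distribute launcher's extra flags pass through harmlessly.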