TUDB-Labs / MoE-PEFT

An Efficient LLM Fine-Tuning Factory Optimized for MoE PEFT

Support for Multi-GPU Training #3

Open kekethu opened 2 months ago

kekethu commented 2 months ago

Great work! But I've noticed that the current implementation seems to only support single-GPU training. Is that correct? If so, do you have any plans to extend support for multi-GPU training in the future? Looking forward to your response. Thanks!

mikecovlee commented 2 months ago

Yes, at the moment we support only a single computing device (such as one GPU or accelerator). We do plan to integrate multi-device training techniques such as FSDP from PyTorch and LoRAPP (from our m-LoRA).
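
For reference, a minimal sketch of the FSDP technique mentioned above is shown below. This is not MoE-PEFT's implementation; the model, hyperparameters, and launch command are placeholder assumptions purely to illustrate how PyTorch shards a model across multiple GPUs.

```python
# Minimal FSDP sketch (not MoE-PEFT code). Assumes a single-node launch via:
#   torchrun --nproc_per_node=<num_gpus> fsdp_sketch.py
import os

import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


def main():
    # One process per GPU; torchrun sets LOCAL_RANK for each process.
    dist.init_process_group("nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model standing in for the fine-tuned LLM.
    model = torch.nn.Linear(1024, 1024).cuda()

    # Wrap the model so parameters and gradients are sharded across ranks.
    model = FSDP(model)

    # Optimizer must be created after wrapping, over the sharded parameters.
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    # Dummy training step to show the usual forward/backward/step flow.
    inputs = torch.randn(8, 1024, device="cuda")
    loss = model(inputs).sum()
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

In this pattern each rank holds only a shard of the parameters and gradients, which is what would let training scale beyond a single device once such support lands in MoE-PEFT.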