kohya-ss / sd-scripts

Apache License 2.0
5.31k stars 880 forks

Multi-gpus(RTX3090-24GB) Flux finetuning #1791

Open wanglaofei opened 1 week ago

wanglaofei commented 1 week ago

Can the parameter "--blocks_to_swap" be used in multi-GPU settings? Without "--blocks_to_swap", how can Flux be fine-tuned on multiple GPUs with 24GB each?

kohya-ss commented 6 days ago

Unfortunately, block swap doesn't seem to work on multiple GPUs. Multi-GPU training would require DeepSpeed or FSDP, but I don't have time to work on that right now.
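For readers who want to experiment with the DeepSpeed route mentioned above: sd-scripts is normally launched through Hugging Face Accelerate, and Accelerate can be pointed at DeepSpeed via its config file. The following is an illustrative sketch of such a config (ZeRO stage 2 with optimizer offload to CPU, two GPUs); it is not confirmed to make Flux fine-tuning fit on 24GB cards, and the specific values are assumptions for illustration only.

```yaml
# Example ~/.cache/huggingface/accelerate/default_config.yaml
# (illustrative; values are assumptions, not a tested recipe)
compute_environment: LOCAL_MACHINE
distributed_type: DEEPSPEED
deepspeed_config:
  zero_stage: 2                      # shard optimizer states + gradients
  offload_optimizer_device: cpu      # move optimizer states to host RAM
  offload_param_device: none
  gradient_accumulation_steps: 1
mixed_precision: bf16
num_machines: 1
num_processes: 2                     # one process per GPU
```

With a config like this, the training script would be started with `accelerate launch` as usual; whether the Flux training script in sd-scripts works correctly under DeepSpeed is exactly the open question in this issue.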

wanglaofei commented 5 days ago

Does this mean Flux fine-tuning currently only works on a single GPU? I hope multi-GPU Flux training comes soon! Thanks for your time; it's great work.