Closed: Dinxin closed this issue 1 year ago.
I have exactly the same problem. Is there a plan to add support for the multi-GPU setting? The batch size seems to be extremely important for quality when fine-tuning.
We are also interested in the development of this feature.
Is there an existing issue for this?
What happened?
I used the multi-GPU training function provided by the accelerate library to reduce the training time of dreambooth. This is the content of default_config.yaml:
Actually, I want to use two of the four available GPUs (gpu_ids=2,3) to run the training.
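For reference, an accelerate default_config.yaml that restricts training to GPUs 2 and 3 on a single machine typically looks something like the sketch below. This is an illustrative assumption, not the reporter's actual file, and values such as mixed_precision may differ.

```yaml
# Illustrative accelerate config (not the original file): single machine,
# two processes pinned to GPUs 2 and 3.
compute_environment: LOCAL_MACHINE
distributed_type: MULTI_GPU
num_machines: 1
num_processes: 2          # one process per selected GPU
gpu_ids: 2,3              # use only GPUs 2 and 3
machine_rank: 0
main_training_function: main
mixed_precision: 'no'     # assumed value; could be fp16/bf16
use_cpu: false
```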
Unfortunately, I ran into the following two problems:
The startup script named webui-user-cuda1.sh is:
Steps to reproduce the problem
Write the following content into conf/default_config.yaml:
Run the following command:
accelerate launch --config_file conf/default_config.yaml launch.py --listen --gradio-auth fine-art:AIGConTaiji --enable-insecure-extension-access
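For context, `accelerate launch` spawns one process per configured GPU and expects the launched training code to go through an Accelerator. The following is a minimal, generic sketch of that pattern with a dummy model and data; it is not the dreambooth extension's actual training code.

```python
# Generic sketch of the training-loop pattern that `accelerate launch` distributes
# across the GPUs selected in default_config.yaml. Dummy model/data for illustration.
import torch
from accelerate import Accelerator

accelerator = Accelerator()  # picks up num_processes / gpu_ids from the launch config

model = torch.nn.Linear(8, 1)                                   # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
dataset = torch.utils.data.TensorDataset(torch.randn(64, 8), torch.randn(64, 1))
loader = torch.utils.data.DataLoader(dataset, batch_size=8)

# prepare() moves the model to the right device, wraps it for DDP,
# and shards the dataloader so each process sees a different slice.
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

for x, y in loader:
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    accelerator.backward(loss)  # replaces loss.backward(); syncs gradients across GPUs
    optimizer.step()
```

If the launched code does not use this pattern internally, the extra processes started by `accelerate launch` will not automatically share the training workload.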
List of extensions
dreambooth
Console logs
Additional information
No response