wjx008 opened this issue 1 year ago
@wjx008 Those arguments are not present in this fork of diffusers; they were added in a later version of the original huggingface/diffusers repo. The train_dreambooth.py script in this fork enables xformers automatically with the following code:
    if is_xformers_available():
        pipeline.enable_xformers_memory_efficient_attention()
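One quick way to confirm this (a minimal sketch; it assumes the installed script parses its arguments with argparse, as the diffusers example scripts do) is to ask the script itself which flags it knows about:

```bash
# List the arguments supported by the installed copy of the script
# and filter for the two flags that triggered the error.
python train_dreambooth.py --help | grep -E "xformers|set_grads_to_none" \
  || echo "these flags are not supported by this fork"
```

If nothing matches, drop the two flags from the launch command.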
Describe the bug
When using the train_dreambooth.py script with the --enable_xformers_memory_efficient_attention and --set_grads_to_none flags added, the following error occurs: `train_dreambooth.py: error: unrecognized arguments: --enable_xformers_memory_efficient_attention --set_grads_to_none`
Reproduction
Followed the instructions in the [dreambooth example readme](https://github.com/ShivamShrirao/diffusers/tree/main/examples/dreambooth):
    pip install git+https://github.com/ShivamShrirao/diffusers.git
    pip install -U -r requirements.txt

Then installed bitsandbytes with `pip install bitsandbytes`, and installed xformers from source:

    pip install ninja
    pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers

Then ran `accelerate config`.
Then followed the steps for the 12GB GPU setup, set all the variables, and executed:

    accelerate launch train_dreambooth.py \
      --pretrained_model_name_or_path=$MODEL_NAME \
      --instance_data_dir=$INSTANCE_DIR \
      --class_data_dir=$CLASS_DIR \
      --output_dir=$OUTPUT_DIR \
      --with_prior_preservation --prior_loss_weight=1.0 \
      --instance_prompt="a photo of sks dog" \
      --class_prompt="a photo of dog" \
      --resolution=512 \
      --train_batch_size=1 \
      --gradient_accumulation_steps=1 --gradient_checkpointing \
      --use_8bit_adam \
      --enable_xformers_memory_efficient_attention \
      --set_grads_to_none \
      --learning_rate=2e-6 \
      --lr_scheduler="constant" \
      --lr_warmup_steps=0 \
      --num_class_images=200 \
      --max_train_steps=800
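Given the reply above, a command that should parse with this fork is the same one with the two unsupported flags removed (a sketch; all other flags are left exactly as in the command above):

```bash
accelerate launch train_dreambooth.py \
  --pretrained_model_name_or_path=$MODEL_NAME \
  --instance_data_dir=$INSTANCE_DIR \
  --class_data_dir=$CLASS_DIR \
  --output_dir=$OUTPUT_DIR \
  --with_prior_preservation --prior_loss_weight=1.0 \
  --instance_prompt="a photo of sks dog" \
  --class_prompt="a photo of dog" \
  --resolution=512 \
  --train_batch_size=1 \
  --gradient_accumulation_steps=1 --gradient_checkpointing \
  --use_8bit_adam \
  --learning_rate=2e-6 \
  --lr_scheduler="constant" \
  --lr_warmup_steps=0 \
  --num_class_images=200 \
  --max_train_steps=800
```

xformers is still enabled automatically by the script when the library is importable, so no flag is needed for it.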
Logs
System Info
diffusers version: 0.15.0.dev0