avan06 opened 3 weeks ago
Update: Today, I tested several parameter settings again and found that whenever `"train_blocks": "single"` is set, adding `--network_args "loraplus_unet_lr_ratio=4"` also triggers the error: `AssertionError: train_blocks must be single for split mode`.
Hi,
Today, while running LoRA training for the Flux.1 model (sd-scripts on the SD3 branch), the "train_blocks must be single for split mode" error suddenly occurred. This error had not appeared before. After reviewing my parameter settings, I finally found the cause: I had specified both the `network_weights` and `dim_from_weights` parameters. Once I disabled `dim_from_weights`, everything worked fine again.

I wonder if anyone else has encountered the same issue. Could it be that `dim_from_weights` also retrieves the double blocks, causing the split-mode mechanism to malfunction?
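To illustrate the hypothesis, here is a minimal toy sketch (not the actual sd-scripts code; all function and key names are hypothetical) of how deriving module dims from a saved checkpoint could re-create double-block LoRA modules even when `train_blocks="single"` was requested, which would then trip an assertion like the one in the error message:

```python
# Hypothetical reconstruction of the suspected interaction.
# A checkpoint saved from an earlier non-split run contains LoRA
# modules for BOTH double and single blocks.
saved_keys = [
    "lora_unet_double_blocks_0_img_attn_qkv",  # from an earlier full run
    "lora_unet_single_blocks_0_linear1",
]

def build_module_list(train_blocks, saved_keys, dim_from_weights):
    """Return the modules the network would be rebuilt with."""
    if dim_from_weights:
        # Dims (and hence modules) come straight from the checkpoint,
        # ignoring the train_blocks restriction.
        return list(saved_keys)
    if train_blocks == "single":
        # Without dim_from_weights, double blocks are filtered out.
        return [k for k in saved_keys if "double_blocks" not in k]
    return list(saved_keys)

def assert_split_mode_ok(train_blocks, modules):
    # Mirrors the assertion message from the reported error.
    trains_double = any("double_blocks" in m for m in modules)
    assert not trains_double, "train_blocks must be single for split mode"

# With dim_from_weights=True the double-block module reappears and the
# assertion fires; with it disabled, only single blocks remain.
```

If something like this is what happens internally, it would explain why disabling `dim_from_weights` (or dropping `network_weights`) makes the error go away.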