Open DarkAlchy opened 9 months ago
ValueError("optimizer got an empty parameter list")
Anyone attempting to use the SDXL script gets that error; the regular LoRA script works.
I'm running into this while training with SD v1.5 and v2.1 as well, using the latest code from main.
Looking into the error a little bit more, it looks like the list of parameters at https://github.com/p1atdev/LECO/blob/main/train_lora.py#L89 is empty. network.prepare_optimizer_params() is returning an empty list because the module-type filter at https://github.com/p1atdev/LECO/blob/main/lora.py#L190 filters everything out.
I'm not sure why all of those modules are being filtered, but the result is a LoRA with no trainable parameters, which the optimizer rejects.
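To illustrate the failure mode, here is a minimal sketch (hypothetical names, not the actual LECO code) of how a module-type allowlist can silently filter out every module, leaving an empty parameter list for the optimizer:

```python
# Hypothetical sketch: a class-name allowlist deciding which modules get
# LoRA parameters. If the allowlist doesn't match any module class names
# (e.g. because SDXL uses different class names than SD v1.5/v2.1), the
# result is an empty list, and torch.optim raises
# ValueError("optimizer got an empty parameter list").

class Linear:
    pass

class Conv2d:
    pass

def prepare_optimizer_params(modules, target_classes):
    # Keep only modules whose class name is in the allowlist.
    return [m for m in modules if m.__class__.__name__ in target_classes]

unet_modules = [Linear(), Conv2d(), Linear()]

# Allowlist matches the module class names -> non-empty parameter list.
print(len(prepare_optimizer_params(unet_modules, {"Linear", "Conv2d"})))  # 3

# Allowlist expects different class names -> nothing matches, empty list.
print(len(prepare_optimizer_params(unet_modules, {"LoRACompatibleLinear"})))  # 0
```

If the SDXL UNet exposes modules under different class names than the filter expects, this would explain why everything gets dropped.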
I tried it too. Some people are getting a different error than I am, but either way the train_lora_XL.py script is broken. My error is that the optimizer's parameter list is empty.