Open Aisper opened 1 year ago
What is it about running batch sizes > 1 that triggers this error in the first place? Does cur_num_prompts get set to an incorrect value? This commit feels like it handles the error rather than fixing it, but I'm not familiar enough with the code to tell for sure.
If this is a true fix, I would say it would be clearer to break out of the loop once the condition is met:
for off in range(cur_num_prompts):
    if base + off >= prompt_len:
        break
    loras = prompt_loras[base + off]
    multiplier = loras.get(lora.name, 0.0)
    if multiplier != 0.0:
        # print(f"c #{base + off} lora.name={lora.name} lora_layer_name={lora_layer_name} mul={multiplier}")
        res[off] += multiplier * alpha * patch[off]
I couldn't attach a debugger to the process to investigate properly, since neither VS Code nor the JetBrains tools want to support attaching to Python 3.10. But I logged some variables: cur_num_prompts is set to a correct value, but on the last step off gets big enough that base + off runs past the end of prompt_loras. It seems this case was simply missed when the author wrote that part of the code.
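To make the indexing concrete, here is a tiny standalone illustration with made-up numbers (the real values depend on the run, and prompt_len is assumed to be len(prompt_loras)):

prompt_len = 3        # number of subprompts, assumed equal to len(prompt_loras)
base = 2              # offset where the last batch starts
cur_num_prompts = 2   # prompts processed in this batch

for off in range(cur_num_prompts):
    index = base + off
    print(index, "ok" if index < prompt_len else "out of range")
# prints "2 ok" and then "3 out of range": prompt_loras[3] would raise IndexError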
About breaking out of the loop: yeah, you're right, that would be clearer.
Does this extension actually work for you? For me, enabling it (with or without this patch) just gives incredibly poor results when used with Latent Couple. I see both LoRAs in each region (even with 0,1,1 weights), and it's as if the LoRA is being applied twice: the image is very grainy, like an overtrained model. If only one subprompt has a LoRA, the side without one comes out blurry, as though some steps were never applied to it.
It just feels like this extension doesn't work, at least not the way I'm using it.
I haven't tried using it with Latent Couple, but it seems to work. I'm currently training a model to replicate the style of one of our in-house artists (I work in gamedev), training on 2.1. The thing is that the checkpoint I'm testing on just refuses to give good results with my trained LoRAs, but enabling Composable LoRA fixes that and the style becomes clearly visible. Unfortunately I can't provide examples, as they are company property, and I don't know the whole process well enough to be sure it works exactly as intended, but I do get better results with it enabled.
Again, I haven't tried it with Latent Couple yet.
Added a check to prevent base + off from going out of range. Fixes https://github.com/opparco/stable-diffusion-webui-composable-lora/issues/10
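Roughly, the loop now guards the index before using it, along these lines (a sketch using the variable names from the loop quoted in the comments above, not the exact diff):

for off in range(cur_num_prompts):
    if base + off < prompt_len:   # the added bounds check
        loras = prompt_loras[base + off]
        multiplier = loras.get(lora.name, 0.0)
        if multiplier != 0.0:
            res[off] += multiplier * alpha * patch[off]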