cheald / sd-webui-loractl

An Automatic1111 extension for dynamically controlling the weights of LoRAs during image generation
MIT License

LoRa Block Weight Compatibility #1

Closed Neverdusk closed 1 year ago

Neverdusk commented 1 year ago

This extension seems amazing, but I received an error when running it alongside LoRa Block Weight (https://github.com/hako-mikan/sd-webui-lora-block-weight).

Here's the actual error message:

```
activating extra network lora with arguments [<modules.extra_networks.ExtraNetworkParams object at 0x000001DF18F48FD0>, <modules.extra_networks.ExtraNetworkParams object at 0x000001DF18F491B0>]: ValueError
Traceback (most recent call last):
  File "C:\Users\silve\Documents\AI Art__SDUX\modules\extra_networks.py", line 92, in activate
    extra_network.activate(p, extra_network_args)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions-builtin\Lora\extra_networks_lora.py", line 22, in activate
    multipliers.append(float(params.items[1]) if len(params.items) > 1 else 1.0)
ValueError: could not convert string to float: '0@0,0.5@0.5'
```

```
LoRA Block weight (lora): NeverStyleV2A: 0.5 x [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
[0.0, 0, 0.0, 0.0, 0, 0.0, 0.0, 0, 0.0, 0.0, 0, 0, 0, 0.0, 0, 0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
Error running process_batch: C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py
Traceback (most recent call last):
  File "C:\Users\silve\Documents\AI Art__SDUX\modules\scripts.py", line 491, in process_batch
    script.process_batch(p, *script_args, **kwargs)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 305, in process_batch
    loradealer(self, o_prompts ,self.lratios,self.elementals)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 540, in loradealer
    multiple = float(called.items[1])
ValueError: could not convert string to float: '0@0,0.5@0.5'
```

And for reference, here's how I typed in the LoRa: `<lora:RoxanneStyle:0@0,0.5@0.5:NONE>`,
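
For context, the 0@0,0.5@0.5 argument is loractl's schedule syntax -- a comma-separated list of weight@step pairs rather than a single number. A rough, purely illustrative sketch of the format (not loractl's actual parser):

```python
# Purely illustrative -- not loractl's actual parser. A loractl weight like
# "0@0,0.5@0.5" is a schedule of weight@step pairs, not a single float,
# which is exactly what float() chokes on in the traceback above.
def parse_schedule(arg: str) -> list[tuple[float, float]]:
    """Parse 'w1@s1,w2@s2,...' into (weight, step) pairs."""
    pairs = []
    for chunk in arg.split(","):
        weight, step = chunk.split("@")
        pairs.append((float(weight), float(step)))
    return pairs

print(parse_schedule("0@0,0.5@0.5"))  # [(0.0, 0.0), (0.5, 0.5)]
```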

Unless I did something wrong, the two may be incompatible. If it's at all possible, making this compatible with LoRA Block Weight would be amazing and would allow an incredible amount of LoRA control, especially if you could change the weights between steps, though that would just be a nice bonus.

cheald commented 1 year ago

I'll see if I can test it out. It may be that LoRA Block Weight does its own LoRA processing, in which case it would be fighting the extra-network processor.

blooest commented 1 year ago

This error is on LBW's side. Without named args, its lora handler expects "lora:name:weight:LBW", which won't even support the now-built-in TE/UNET/DDIM split; you have to use the named arg "lbw=LBW". That said, it theoretically should work if you just change that in your prompt. I haven't updated to 1.5 yet to test it for myself, although this extension is certainly tempting me....

EDIT: As for adjusting the block weighting at specific steps, you could load the network multiple times and kick in the weights you want, when you want them, that way (a sketch follows below). It's a bit of a workaround, but unless cheald took it upon themself to just integrate LBW into this extension, or hako-mikan entirely rewrites LBW, I doubt direct compatibility is really feasible. That extension is super hacky at present.
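
Something along these lines is what I mean -- two loads of the same network, each with its own schedule and block preset (the preset names and schedules here are just examples, untested):

```
<lora:MyLora:1@0,0@0.5:lbw=INALL>
<lora:MyLora:0@0.5,1@1:lbw=OUTALL>
```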

Neverdusk commented 1 year ago

Huh. I didn't know "lbw" could apply to non-LyCORIS Lora.

So an example of the correct syntax would be `<lora:LoraName:0.5@0,1@1:lbw=INALL>`? Or am I getting something wrong?

blooest commented 1 year ago

Should be, yes.

As to the first part -- it didn't, before their kludgy update for 1.5. Now, if the "networks" module is detected instead of "lora" (as it is in 1.5), LBW just passes everything through to its "lycodealer" function, which parses solely for lbw= args.

cheald commented 1 year ago

It's worth noting that the 1.5 extra networks parser will parse out and make available arbitrary named arguments; that's how I'm adding support for the hr/hrunet/hrte arguments. Hopefully other extensions that want to extend the lora network syntax will start consuming those as well, rather than doing their own parsing.
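
For illustration, here's roughly what that looks like from an extension's side. The class below is an approximation of the 1.5 parser, inferred from the params.items / params.named accesses in the tracebacks in this thread, not copied from the webui source:

```python
# Approximation of webui 1.5's ExtraNetworkParams, for illustration only.
# A tag like <lora:MyLora:0@0,0.5@0.5:lbw=ALL> yields positional items plus
# key=value pairs, so extensions can share the syntax without re-parsing it.
class ExtraNetworkParams:
    def __init__(self, items):
        self.items = items
        self.positional = [i for i in items if "=" not in i]
        self.named = dict(i.split("=", 1) for i in items if "=" in i)

params = ExtraNetworkParams(["MyLora", "0@0,0.5@0.5", "lbw=ALL"])
print(params.positional[0])     # "MyLora"
print(params.positional[1])     # "0@0,0.5@0.5" -- a schedule, left as a string
print(params.named.get("lbw"))  # "ALL" -- what LBW could consume directly
```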

blooest commented 1 year ago

Well... I updated and tested, and, nope: the LBW parser freaks out if the first arg is a non-float, too. Which is to say, it's entirely incompatible with this extension (and with the base te= named argument). There's nothing cheald can do about it from this side.

@cheald I don't really understand the inner workings of this well enough to give a proper explanation; if you have the time, could you add some helpful context for hako-mikan in https://github.com/hako-mikan/sd-webui-lora-block-weight/issues/82?

Neverdusk commented 1 year ago

After trying again, I still seem to be getting that error because of LBW, at least when using the X@X,Y@Y format. I even tried te=X@X,Y@Y, and I still get the "could not convert string to float" error.


```
activating extra network lora with arguments [<modules.extra_networks.ExtraNetworkParams object at 0x0000023592D15030>, <modules.extra_networks.ExtraNetworkParams object at 0x0000023592D179A0>, <modules.extra_networks.ExtraNetworkParams object at 0x0000023592D178E0>, <modules.extra_networks.ExtraNetworkParams object at 0x0000023592D15FF0>]: ValueError
Traceback (most recent call last):
  File "C:\Users\silve\Documents\AI Art__SDUX\modules\extra_networks.py", line 104, in activate
    extra_network.activate(p, extra_network_args)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions-builtin\Lora\extra_networks_lora.py", line 26, in activate
    te_multiplier = float(params.named.get("te", te_multiplier))
ValueError: could not convert string to float: '0@0,0.5@0.5'

Error running process_batch: C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py
Traceback (most recent call last):
  File "C:\Users\silve\Documents\AI Art__SDUX\modules\scripts.py", line 544, in process_batch
    script.process_batch(p, *script_args, **kwargs)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 306, in process_batch
    loradealer(self, o_prompts ,self.lratios,self.elementals)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 541, in loradealer
    multiple = float(called.items[1])
ValueError: could not convert string to float: 'te=0@0,0.5@0.5'
```

However, the hr=X syntax seems to work fine, even with LBW.

Edit: Nevermind, I received a different error when I tried hr=X. At this point I'm afraid whatever's happening is a bit over my head, but I assume it's as you said and LBW is doing something a bit odd.

Here's the output from the syntax I used:

```
LoRA Block weight (lora): RoxanneStyle: 0.5 x [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
[0.0, 0, 0.0, 0.0, 0, 0.0, 0.0, 0, 0.0, 0.0, 0, 0, 0, 0.0, 0, 0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
LoRA Block weight (lora): Weight: 3.0 x [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
[1.0, 0, 1.0, 1.0, 0, 1.0, 1.0, 0, 1.0, 1.0, 0, 0, 0, 0.0, 0, 0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
Error running process_batch: C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py
Traceback (most recent call last):
  File "C:\Users\silve\Documents\AI Art__SDUX\modules\scripts.py", line 544, in process_batch
    script.process_batch(p, *script_args, **kwargs)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 306, in process_batch
    loradealer(self, o_prompts ,self.lratios,self.elementals)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 571, in loradealer
    if len(lorars) > 0: load_loras_blocks(self, lorans,lorars,multipliers,elements,ltype)
  File "C:\Users\silve\Documents\AI Art__SDUX\extensions\sd-webui-lora-block-weight\scripts\lora_block_weight.py", line 597, in load_loras_blocks
    lbw(lora.loaded_loras[l],lwei[n],elements[n])
IndexError: list index out of range
```

blooest commented 1 year ago

Yup. LBW, in addition to its own weight argument, unconditionally parses the first argument as a float to use as an overall multiplier. That's fair enough for their purposes, and it worked just fine in 1.4 and earlier, but it means LBW is fully incompatible with adjusting the text encoder with step control here (and with the native te= named argument in 1.5+, too).
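
Distilled from the tracebacks above, the clash is just this (an illustrative sketch, not LBW's actual code):

```python
# lora_block_weight.py does roughly `multiple = float(called.items[1])`,
# which works for a plain weight but not for a loractl step schedule.
def lbw_overall_multiplier(items):
    return float(items[1])

print(lbw_overall_multiplier(["RoxanneStyle", "0.5"]))  # 0.5 -- fine
try:
    lbw_overall_multiplier(["RoxanneStyle", "0@0,0.5@0.5"])
except ValueError as err:
    print(err)  # could not convert string to float: '0@0,0.5@0.5'
```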

EDIT to match your edit: it just failed to parse, generally. That wasn't even the float failure; their parser didn't know how to make sense of it at all.

In both cases it's, again, entirely on their end: they need to rewrite their code to be more versatile in light of the new networks implementation.

EDIT 2: This has been partially fixed. LBW will now read the first arg if it's unnamed, or the unet= arg if that's named instead. It still has a number of edge cases, and it's still incompatible with adjusting the unet with step control, but it's a start. Your second test of using the lora only during hires fix seemed to work without errors for me, although I'd suggest passing lbw=ALL so it doesn't do anything unexpected.

cheald commented 1 year ago

I'm gonna close this ticket for the time being while we wait for the other extension to catch up. Once they're compatible again, I'll see about revisiting this.

AG-w commented 6 months ago

Is there a hint or some rough explanation of how I could make the LBW extension compatible? Is the problem only in the parser, or does it go deeper?