Closed Holyniwa closed 9 months ago
It is, but it's in the dev branch and will be merged to master in the next release. Feel free to check out the dev branch if you want to try it out.
Was this implemented? I'm not sure how to switch to a dev branch (I'm a GitHub noob).
It's in the last release; you don't need the dev branch.
Oh? I updated to the latest release, but it's still not working. Do you have any documentation for your weight blocks, or are they supposed to work as they did with the extension (which I've now disabled)? I'm getting these error messages:
```
D:\stable-diffusion\automaticNEW\modules\extra_networks.py:75 in activate

   74 │   try:
 > 75 │       extra_network.activate(p, extra_network_args)
   76 │   except Exception as e:

D:\stable-diffusion\automaticNEW\extensions-builtin\Lora\extra_networks_lora.py:38 in activate

   37 │       te_multiplier = float(params.named.get("te", te_multiplier))
 > 38 │       unet_multiplier = [float(params.positional[2]) if len(params.positional) > 2
   39 │       unet_multiplier = [float(params.named.get("unet", unet_multiplier[0]))] * 3

ValueError: could not convert string to float: '0,0.1,0.1,0.1,0.1,0.1,0.1,1,1,1,1,1,1,1,1,1,1'
```
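The error itself is just `float()` being handed the entire comma-separated string at once instead of one number. A minimal sketch of the distinction (the helper name is hypothetical, not SD.Next's actual code):

```python
def parse_block_weights(spec: str) -> list[float]:
    """Parse a comma-separated block-weight string into floats.

    Calling float() on the whole string raises the ValueError seen
    in the traceback above; splitting on commas first parses each
    per-block weight individually.
    """
    return [float(w) for w in spec.split(",")]

# The old extension-style string carries 17 per-block weights:
weights = parse_block_weights("0,0.1,0.1,0.1,0.1,0.1,0.1,1,1,1,1,1,1,1,1,1,1")
```

This is why the old `<lora:name:strength:w1,w2,...>` strings blow up if the code path expects a single multiplier.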
Full docs still need to be written, but you'll see a short example in the changelog.
Ahh, I see it now. So rather than `<lora:epiNoiseoffset_v2:1.75:0,0.1,0.1,0.1,0.1,0.1,0.1,1,1,1,1,1,1,1,1,1,1>`,
now it's
We may add individual blocks if there is demand.
Right. Dang, I really do hope so 🙏. I noticed that, with the in/mid/out split, BASE is missing as the first block, and blocks 9–11 function very differently from 12–17. (I'm not really an expert on it either; I've just had months of testing with them. They're fairly complex, which is why I understand the community not using them in great detail.) I also forgot to ask: does this work for LyCORIS?
Update: I'm experimenting with it a bit more, and it looks like it may actually have individual weight block support after all. I think it's structured as:
Looping in @AI-Casanova, as he authored the recent block-weight implementation.
What the others call BASE is 99% just text-encoder layers (plus two other hard-to-classify keys).
TE LoRA (if those keys exist) can be accessed by splitting the numerical inputs like this.
Thanks Vlad!
Hey Casanova, first off, thank you so much for adding this feature! It's easily my most used feature in the older versions, and I've been dying to see it implemented (or supported) in the newer updates.
Interesting information regarding BASE. A lot of the information involving block weights still goes a bit over my head, but I appreciate the knowledge!
Regarding this implementation's usage, I looked up and down the changelogs and honestly couldn't find a detailed example.
I know some people used INS, IND, INALL, MIDD, OUTD, OUTS, and OUTALL in the past with a different extension; I'm not sure whether those are relevant to the new structure. (I personally liked having the weight blocks in the prompt rather than in a separate tab.) Honestly, anything helps, lol. Again, thank you!
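For context, those preset names in the old extension are just aliases that expand to a full weight vector before parsing. A hedged sketch of how such an expansion could work; the preset values below are made up for illustration and are not the real tables from sd-webui-lora-block-weight:

```python
# Hypothetical preset table (illustrative values only; the real
# extension ships its own definitions for INS, IND, MIDD, etc.).
PRESETS: dict[str, list[float]] = {
    "ALL":  [1.0] * 17,
    "NONE": [0.0] * 17,
    "MIDD": [1.0] * 8 + [1.0] + [0.0] * 8,  # example pattern only
}

def expand_preset(spec: str) -> list[float]:
    """Return a 17-value block-weight list from either a preset
    name or a raw comma-separated weight string."""
    if spec in PRESETS:
        return PRESETS[spec]
    return [float(w) for w in spec.split(",")]
```

The appeal of presets in the prompt is that `<lora:name:1:MIDD>`-style shorthand stays readable while still expanding to the full per-block vector.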
Feature description
I still can't believe that weight blocks aren't part of SDN... For some reason, the extension (https://github.com/hako-mikan/sd-webui-lora-block-weight) hasn't been working for the last two months, which is why I haven't bothered updating, but with LCMs becoming more popular, I really want to get on the newest version.
Before anyone says "it is, you're just doing it wrong": no, it isn't. I keep seeing messages like `ValueError: could not convert string to float: '2,2.5,2.5,2.5,2.5,2.5,2.5,0.5,0.5,0.5,0.5,0.5,0.8,0.8,0.8,0.8,0.8'` on the first LoRA/LyCORIS it hits, and then it refuses to load any LoRA/LyCORIS after the error.
Version Platform Description
No response