diodiogod closed this 7 months ago
Yes. The values for BASE and TE are the same, so the results you tested should be identical. In fact, in my tests the following three produced the same result:
Generated using lbw for `<lora:NAME:0.5:1:lbw=1,1,1,1,1,1,1,1,1,1,1,1>`
Generated using lbw for `<lora:NAME:1:1:lbw=0.5,1,1,1,1,1,1,1,1,1,1,1>`
Generated by a model merged using the ratio `lora:NAME:1:0.5,1,1,1,1,1,1,1,1,1,1,1`
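The equivalence can be sketched as follows. This is an illustrative model, not the extension's actual code: the assumption is that the effective scale of each block is the product of the global multiplier (TE or UNet) and the per-block lbw weight, with `lbw[0]` (BASE) covering the text encoder. Moving the 0.5 from the TE multiplier into BASE then yields identical per-block scales:

```python
def effective_scales(te_mult, unet_mult, lbw):
    """Per-block scale = global multiplier * block weight.

    lbw[0] is BASE (text encoder); the remaining entries are UNet blocks.
    """
    te_scale = te_mult * lbw[0]
    unet_scales = [unet_mult * w for w in lbw[1:]]
    return [te_scale] + unet_scales

# <lora:NAME:0.5:1:lbw=1,1,1,1,1,1,1,1,1,1,1,1>
a = effective_scales(0.5, 1.0, [1.0] * 12)
# <lora:NAME:1:1:lbw=0.5,1,1,1,1,1,1,1,1,1,1,1>
b = effective_scales(1.0, 1.0, [0.5] + [1.0] * 11)

assert a == b  # both are [0.5, 1, 1, ..., 1]
```

Under this model, merging with the same ratio bakes exactly these scales into the weights, which is why the third outcome matches as well.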
This could be due to differences in your environment. LoRA handling can vary significantly depending on the web-ui version and its settings.
Oh my... I've just tested on my Forge installation and you are correct. `<lora:NAME:1000:1:lbw=0,1,1,1,1,1,1,1,1,1,1,1>` did not produce noise.
BUT I kept testing, and now I think it's specific to SDXL. On SD1.5 you are correct: `<lora:NAME:99:1:lbw=0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1>` won't generate noise. It acts as TE:0.
But with SDXL LoRAs, `<lora:SDXLloraNAME:99:1:lbw=0,1,1,1,1,1,1,1,1,1,1,1>` will generate noise. Could you try it? It happens both on Forge and on my automatic1111.
Anyway, thanks for the response. Your extension is probably the most essential and amazing of all the SD extensions I use!
Ah, you were right. The issue you mentioned does happen with XL. When I looked into it, the code hadn't handled the second text encoder that was added in SDXL. The problem was likely missed initially because early SDXL LoRAs didn't train the Text Encoder at all. It's been fixed now. Thanks.
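A sketch of the kind of fix involved, with assumed names rather than the extension's real code: SD1.5 LoRA text-encoder weights conventionally use keys beginning with `lora_te_`, while SDXL LoRAs carry two encoders under `lora_te1_` and `lora_te2_`. Logic written for a single encoder can miss `te2` entirely, leaving its weights at full strength even when BASE is 0, which would explain the noise:

```python
BASE = 0  # index of the text-encoder weight in the lbw list

def block_index(key, is_sdxl):
    """Map a LoRA weight key to its lbw block index (only BASE shown here).

    Hypothetical helper: the point is that under SDXL, BOTH encoder
    prefixes must map to BASE so both are scaled by the same weight.
    """
    te_prefixes = ("lora_te1_", "lora_te2_") if is_sdxl else ("lora_te_",)
    if key.startswith(te_prefixes):
        return BASE
    return None  # UNet keys would map to their own block indices

# With both prefixes covered, the second encoder falls under BASE too:
assert block_index("lora_te1_text_model", True) == BASE
assert block_index("lora_te2_text_model", True) == BASE
```

A check that only matched a single `lora_te` prefix, or an exact SD1.5 module list, would return `None` for the `te2` keys and silently skip scaling them.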
I'm really grateful for this extension and supermerger. Analyzing my SDXL LoRA, I get the BEST results by reducing the effect of the trained LoRA text encoder and cutting some UNet blocks. With these settings:
`<lora:NAME:0.25:1:lbw=0.25,0,0,0,0,0,1,1,1,0,0.15,1>`
I get really great results. I really wish I could merge my LoRA so that the Text Encoder (TE) and UNet values are baked in as the ones above, but I don't know if that is possible. From your other responses here https://github.com/hako-mikan/sd-webui-supermerger/issues/281#issuecomment-1805428022 https://github.com/hako-mikan/sd-webui-lora-block-weight/issues/110#issuecomment-1708282404
you said that the BASE value is the same as the FIRST TE value of the LoRA. As if in