cheald / sd-webui-loractl

An Automatic1111 extension for dynamically controlling the weights of LoRAs during image generation

ValueError: could not convert string to float #6

Closed Luke-L closed 1 year ago

Luke-L commented 1 year ago
Traceback (most recent call last):
  File "B:\Github\stable-diffusion-webui-latest\modules\extra_networks.py", line 104, in activate
    extra_network.activate(p, extra_network_args)
  File "B:\Github\stable-diffusion-webui-latest\extensions-builtin\Lora\extra_networks_lora.py", line 25, in activate
    te_multiplier = float(params.positional[1]) if len(params.positional) > 1 else 1.0
ValueError: could not convert string to float: '0.0@0.6,0.8@0.7'

The LoRA part of my prompt is <lora:Piercing01:0.0@0.6,0.8@0.7>. I can't figure out what about the prompt is causing this error. Any ideas or tips? I've tried with the 'Plot the LoRA weight in all steps' box both checked and unchecked, and I've read the wiki page too.

cheald commented 1 year ago

Are you sure the extension is enabled? That error message is coming from the default network handler, not the loractl extension. The default handler can't parse non-numeric weights (such as schedules containing the @ sign).
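To illustrate: the failing line in the traceback is a bare float() call, and a schedule string has to be split into weight@step pairs before each piece can be converted. The parse_schedule helper below is a hypothetical sketch for illustration only, not loractl's actual parser:

    # The stock handler effectively does float(params.positional[1]),
    # which fails on a schedule string:
    try:
        float("0.0@0.6,0.8@0.7")
    except ValueError as err:
        print(err)  # could not convert string to float: '0.0@0.6,0.8@0.7'

    # A hypothetical parser for the weight@step syntax:
    def parse_schedule(arg):
        # "0.0@0.6,0.8@0.7" -> [(0.0, 0.6), (0.8, 0.7)]
        pairs = []
        for spec in arg.split(","):
            weight, _, step = spec.partition("@")
            pairs.append((float(weight), float(step)))
        return pairs

    print(parse_schedule("0.0@0.6,0.8@0.7"))  # [(0.0, 0.6), (0.8, 0.7)]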

What version of the webui are you using, and are you using any other extensions which modify LoRA behavior?

superprat commented 1 year ago

Getting the same error. Mine was related to the composable-lora extension, even though that extension was disabled.

Deleting the extension fixed the issue.

EvansJahja commented 1 year ago

Came here to say I have the same issue.

I don't have any other extension/script besides this and Regional Prompter (which is off).

Edit: Removed Regional Prompter and it still doesn't work.

EvansJahja commented 1 year ago

I think I figured out the issue.

You have a class LoraCtlNetwork defined in lora_ctl_network.py

And also script_callbacks.on_before_ui(before_ui)

which will register LoraCtlNetwork under the name "lora", intentionally replacing the original "lora".

However, because the original lora handler is also registered from an on_before_ui callback, for some users that callback runs later, so your "lora" gets registered first and is then overwritten by the original lora.
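To make the ordering hazard concrete: extra networks live in a name-keyed registry, so whichever callback registers last wins. A toy sketch, not the webui's actual code:

    registry = {}

    def register(name, handler):
        # Last writer wins: a later registration silently replaces an earlier one.
        registry[name] = handler

    register("lora", "LoraCtlNetwork")    # loractl's on_before_ui runs first...
    register("lora", "ExtraNetworkLora")  # ...then the built-in Lora overwrites it

    print(registry["lora"])  # ExtraNetworkLora: the stock handler,
                             # which cannot parse "0.0@0.6,0.8@0.7"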

I may be able to raise a proper PR later on, but from my testing, the short-term solution is to edit lora_ctl_network.py

Changing the bottom of this file:

script_callbacks.on_before_ui(before_ui)

to

script_callbacks.on_model_loaded(before_ui)

solves the issue for me.
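For context, here is a minimal sketch of what the tail of lora_ctl_network.py might look like after that edit, assuming before_ui simply re-registers LoraCtlNetwork (defined earlier in the same file) under the "lora" name. The *_args is needed because on_model_loaded callbacks are passed the loaded model, while on_before_ui callbacks receive no arguments:

    from modules import extra_networks, script_callbacks

    def before_ui(*_args):
        # Replace the stock "lora" handler with loractl's, which understands
        # weight@step schedules. *_args absorbs the sd_model argument that
        # on_model_loaded passes along.
        extra_networks.register_extra_network(LoraCtlNetwork())

    # on_model_loaded fires on every checkpoint load, well after extension
    # startup, so this registration is no longer at the mercy of
    # on_before_ui callback ordering.
    script_callbacks.on_model_loaded(before_ui)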

cheald commented 1 year ago

Great catch. I'll make that change now.

cheald commented 1 year ago

fb499ddd5ca55fbcf40413c2115206dcf20424ed should fix it up. Please let me know if the issue recurs.