mcmonkeyprojects / SwarmUI

SwarmUI (formerly StableSwarmUI): a modular Stable Diffusion web user interface, with an emphasis on making power tools easily accessible, high performance, and extensibility.

Missing LoRAs causing issues when reusing parameters #41

Closed · Lawrr closed this issue 2 months ago

Lawrr commented 3 months ago

Expected Behavior

  1. The number of loraweights matches the number of loras applied to the generated image
  2. The lora weight is updated correctly when "Reuse Parameters" is pressed
  3. The lora weight shown in the UI matches the weight used in the image generation

Actual Behavior

When "Reuse Parameters" is pressed with a missing lora:

And then when an image is generated afterwards:

Steps to Reproduce

  1. Generate an image that uses 2 loras. To make this clearer, set them to different weights (and don't use 1.0). Let's say we pick 0.1 for lora A and 0.9 for lora B.
  2. Rename or move lora A so it is "missing"
  3. Refresh the lora list and then refresh the browser
  4. Notice lora A is now gone from the "current loras" list and the weight of lora B has been reset to 1.0 in the UI.
  5. Select the image you just generated and click "reuse parameters"
  6. Notice that in "current loras":
    • lora A from the original generation is not there (this is expected because it has been moved and is now "missing")
    • lora B from the original generation is there, but the weight is not updated (still shows 1.0 instead of 0.9).
  7. Generate another image
  8. Notice the following issues:
    • The image metadata contains 1 lora (lora B) but still contains the 2 loraweights from the original generation
    • The weight of lora B that was applied to the image was neither 1.0 (as seen in the UI) nor 0.9 (from the original generation), but 0.1 (see the sketch after this list).
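For what it's worth, the observed 0.1 on lora B is exactly what you'd get if the UI paired the filtered lora-name list against the stale, unfiltered loraweights list by position. A minimal TypeScript sketch of that suspected failure mode (hypothetical code, not SwarmUI's actual implementation):

```typescript
// Illustrative only -- not SwarmUI's actual code. Shows how a stale
// weights list can hand the wrong weight to a surviving lora.
const originalLoras = ["loraA", "loraB"];
const originalWeights = [0.1, 0.9];

// After loraA goes "missing", the names list gets filtered, but the
// reused loraweights parameter still carries both original values.
const currentLoras = originalLoras.filter((name) => name !== "loraA");
const reusedWeights = originalWeights; // stale: still [0.1, 0.9]

// Positional pairing: loraB now sits at index 0, so it silently picks
// up loraA's old weight of 0.1 -- matching the behavior in step 8.
currentLoras.forEach((name, i) => {
  console.log(`${name} -> weight ${reusedWeights[i]}`);
});
// Prints: loraB -> weight 0.1
```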

Debug Logs

Nothing relevant in the server logs, but the browser console contains:

Ignoring invalid LoRA weights value. Have 1 LoRAs (["<lora name>"]), but 2 weights (0.7,0.8)
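That message reads like a simple length check between the lora list and the weights list, with the mismatched weights value then dropped. A hedged sketch of that kind of guard (the function name and fallback are assumptions, not SwarmUI's actual frontend code):

```typescript
// Hypothetical guard, for illustration only: reject the weights
// parameter when its length doesn't match the lora list.
function resolveLoraWeights(loras: string[], weights: number[]): number[] {
  if (weights.length !== loras.length) {
    console.warn(
      `Ignoring invalid LoRA weights value. Have ${loras.length} LoRAs ` +
        `(${JSON.stringify(loras)}), but ${weights.length} weights (${weights.join(",")})`
    );
    return loras.map(() => 1.0); // assumed fallback to a default weight
  }
  return weights;
}

// resolveLoraWeights(["<lora name>"], [0.7, 0.8]) logs the warning above
// and falls back to [1.0].
```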

Other

Not directly related, but when reusing parameters with a missing lora, it is currently just omitted from the current loras. It might be nice to keep it in the "current loras" list but faded out. This would make it easier to identify that a lora is missing and would be a clearer indicator to the user that the reused parameters don't exactly match the image.

Or just raise a pop-up warning.
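A rough sketch of the faded-out idea, with entirely hypothetical markup and class names, just to make the suggestion concrete:

```typescript
// Hypothetical rendering helper: keep a missing lora visible but faded,
// so the user can see that the reused parameters don't fully match.
interface LoraEntry {
  name: string;
  weight: number;
  missing: boolean; // true if the file can no longer be found
}

function renderLoraEntry(entry: LoraEntry): string {
  const cls = entry.missing ? "lora-entry lora-missing" : "lora-entry";
  const label = entry.missing ? `${entry.name} (missing)` : entry.name;
  const style = entry.missing ? ' style="opacity: 0.5"' : "";
  return `<div class="${cls}"${style}>${label} @ ${entry.weight}</div>`;
}
```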

mcmonkey4eva commented 3 months ago

Fixed - it will now render the stray lora as added and apply everything correctly, then error when you hit generate.

Lawrr commented 3 months ago

Appreciate the quick fix, and I like the idea of the error message on generate. It might be useful to include the exact name of the missing value in the error message though, rather than just the entire list value.

Also still seeing one issue: the lora list appears to be ordered alphabetically, and if the missing lora is alphabetically first in the list, pressing "Reuse Parameters" moves it to the end of the "current loras" list. As a result, the lora weights get switched around, so lora A ends up with lora B's weight. I confirmed this isn't just a graphical bug; the comfy workflow from "import from generate tab" has the weights swapped around as well.
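For illustration, this swap is what positional pairing would produce if the missing lora is re-appended at the end of the name list while the weights keep their original order (hypothetical sketch, not the actual SwarmUI code):

```typescript
// Illustrative only: the missing lora gets moved to the end of the
// name list, but the weights are still applied by index, so they swap.
const reusedWeights = [0.1, 0.9];            // loraA -> 0.1, loraB -> 0.9
const displayedNames = ["loraB", "loraA"];   // missing loraA pushed to the end

displayedNames.forEach((name, i) => {
  console.log(`${name} -> weight ${reusedWeights[i]}`);
});
// Prints: loraB -> weight 0.1, loraA -> weight 0.9 (swapped)
```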

mcmonkey4eva commented 3 months ago

... How did you get them out-of-order in the first place?

Lawrr commented 2 months ago

  1. Add loras b and c
  2. Generate image
  3. Rename b.safetensors to b1.safetensors so that it is "missing"
  4. Refresh loras from UI
  5. Refresh browser
  6. b and c are now swapped after the browser is refreshed

Note: this non-alphabetical order remains even if you manually remove the loras and click "reuse parameters".

mcmonkey4eva commented 2 months ago

the above commit fixes that. Prooobably should just replace with special handling rather than a hack to autocorrect the order, but, eh, it works for now
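For reference, the "autocorrect the order" approach described here could look roughly like the sketch below: realign the weights with a canonical name order so each lora keeps its own weight. This is a hypothetical illustration, not the actual commit.

```typescript
// Hypothetical version of "autocorrect the order": reorder the weights
// together with the names so each lora keeps the weight it was given.
function realignWeights(
  canonicalOrder: string[], // the order the backend expects
  names: string[],          // possibly reordered names from the UI
  weights: number[]         // weights paired with `names` by index
): number[] {
  return canonicalOrder.map((name) => {
    const i = names.indexOf(name);
    return i >= 0 ? weights[i] : 1.0; // assumed default if a lora is missing
  });
}

// realignWeights(["b", "c"], ["c", "b"], [0.9, 0.1]) -> [0.1, 0.9]
```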