blepping / comfyui_jankhidiffusion

Janky implementation of HiDiffusion for ComfyUI
Apache License 2.0

Refactor #18

Closed blepping closed 1 month ago

blepping commented 1 month ago

- Refactor RAUNet code to avoid monkeypatching Upsample/Downsample blocks (by pamparamm)
- Move the two_stage_upscale toggle into two_stage_upscale_mode (by pamparamm)
- Refactor RAUNet code to avoid monkeypatching forward_timestep_embed
- Allow setting a downscale factor and mode for CA downsampling in the advanced RAUNet node
- Make MSW-MSA attention failures due to size mismatches a warning rather than a hard error
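a rough sketch of what the last point could look like (all names here are hypothetical, not the actual jankhidiffusion code): warn and fall back to the unpatched attention path instead of raising when the latent doesn't tile evenly into windows.

```python
import warnings

def apply_windowed_attention(x, window_size):
    # stand-in for the real shifted-window attention; identity here
    return x

def msw_msa_attention(x, height, width, window_size):
    # hypothetical sketch: if the latent size doesn't divide evenly
    # into windows, emit a warning and skip the optimization rather
    # than raising a hard error that kills the whole sampling run
    if height % window_size or width % window_size:
        warnings.warn(
            f"MSW-MSA attention: size {height}x{width} not divisible "
            f"by window size {window_size}; falling back"
        )
        return x  # the original (unwindowed) attention path runs instead
    return apply_windowed_attention(x, window_size)
```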

blepping commented 1 month ago

@pamparamm sorry, i ended up taking your pull and making it my pull. i originally intended to just do some reformatting/lint squashing but i got carried away and made a bunch of other changes.

i'm not quite done with this stuff but i figured i would ping you just to see if you had any feedback/criticism for the additional changes. if so, i'll definitely take it into account.

i also added you to the credits section since your contribution was so significant.

pamparamm commented 1 month ago

LGTM

blepping commented 1 month ago

> LGTM

unfortunately i found a serious issue that's in your original version as well: the changes don't apply only to the cloned model, so they persist when the RAUNet settings are changed, and each reapplication adds another layer of pass-through forward functions.

for example, if you add a print to the forward up/forward down functions, it's easy to verify. it can be reproduced pretty easily:

  1. create a RAUNet node and set the time to something like start 0, end 0.1 (the issue occurs when falling back to the "original" forward function)
  2. start running the workflow
  3. cancel it
  4. make any change that forces the RAUNet node to reapply (like just adding whitespace to the block list), then start again
  5. optionally repeat a few times

you'll end up with output like this on each model call (whereas if it were working correctly you'd see just one call into the custom forward functions):

```
DOWN: 4.205848693847656 False
DOWN: 2.819605827331543 False
DOWN: 2.3260841369628906 False
DOWN: 1.141690731048584 False
UP: 4.205848693847656 False
UP: 2.819605827331543 False
UP: 2.3260841369628906 False
UP: 1.141690731048584 False
```
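the layering can be reproduced in isolation. this sketch (hypothetical, not the actual node code) shows the buggy pattern: each application wraps whatever forward currently is, including wrappers left behind by a previous run, so a single model call triggers every layer:

```python
class Block:
    def forward(self, x):
        return x + 1

def apply_patch(block, log):
    # buggy pattern: captures the *current* forward, which may already
    # be a wrapper from an earlier application that was never removed
    orig_forward = block.forward

    def patched_forward(x):
        log.append(x)  # stands in for the DOWN:/UP: debug print
        return orig_forward(x)

    block.forward = patched_forward

block = Block()
log = []
apply_patch(block, log)  # first workflow run
apply_patch(block, log)  # settings changed, node reapplies the patch
block.forward(0)
print(len(log))  # 2: both wrapper layers fire for one forward call
```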

you can also use hasattr to check whether block_name already exists on model.model before the setattr: the second time around, it will already be there.

i'm not really sure how to deal with this. i actually don't really understand how adding an object patch called hd_input_4 ends up replacing the model input 4 block.

blepping commented 1 month ago

this may have fixed it. i think the issue might actually have been benign-ish in your version because you would still get an updated time. it caused serious issues in my version because the time in hdconfig for that version of forward wouldn't get updated anymore, so the effect could get stuck in an enabled state.
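a hypothetical illustration of that stale-state failure mode: if the installed forward closed over an hdconfig from an earlier application, a fresh config object created on reapply never reaches it, and the old time window stays in effect.

```python
class HDConfig:
    # illustrative stand-in for the per-patch config holding the
    # start/end time window for the effect
    def __init__(self, start, end):
        self.start, self.end = start, end

def make_forward(hdconfig):
    def forward(t):
        # effect is enabled only inside the configured time window
        return hdconfig.start <= t <= hdconfig.end
    return forward

old_cfg = HDConfig(0.0, 1.0)
forward = make_forward(old_cfg)  # first application, still installed

new_cfg = HDConfig(0.0, 0.1)     # user narrows the window; the node
make_forward(new_cfg)            # reapplies, but the *old* wrapper
                                 # keeps running with old_cfg
print(forward(0.5))  # True: effect stuck enabled outside the new window
```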

i'm not sure this approach is correct but it seems to work and existing model patches still seem to have an effect. will have to test more.