rgthree / rgthree-comfy

Making ComfyUI more comfortable!

Somehow the rgthree execution patch blows up when running this workflow. #118

Open · Trung0246 opened 8 months ago

Trung0246 commented 8 months ago

Reference: https://github.com/Trung0246/ComfyUI-0246/issues/20

I also tried to run the workflow myself. The cache looks weird:

[image: screenshot of the cache output]

It looks like exponential growth. Unsure how to move this forward.

rgthree commented 8 months ago

This is a huge workflow, and something that is probably giving ComfyUI some issues.

@Pojo267 - were you iterating on this? Was there a recent point this was working?

This could be a bug in the optimization patch, but my first guess is that the structure of the highway nodes and the sheer size of this workflow are the real problem. Regardless, the first step would be to turn off the optimization patch via the rgthree config:

  1. The config is a JSON file in the root of the rgthree-comfy custom node directory. Create it (if it doesn't exist yet) at [comfyui]/custom_nodes/rgthree-comfy/rgthree_config.json (there's also a small script sketched after these steps that makes the same edit).
  2. At the very least, the file should have this to turn off the optimization:
    {
      "patch_recursive_execution": false
    }
  3. When you restart ComfyUI, you should only see the first message below, and NOT the second message:
    [rgthree] Loaded 20 exciting nodes.
    [rgthree] Optimizing ComfyUI recursive execution. If queueing and/or re-queueing
        seems broken, change "patch_recursive_execution" to false in rgthree_config.json
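
If you'd rather not hand-edit the file, here's a minimal sketch of a script that flips the flag while keeping any other settings already in rgthree_config.json. It's my own snippet, not something shipped with rgthree-comfy, and it assumes it's run from the ComfyUI root directory:

    import json
    import os

    # Path from step 1: [comfyui]/custom_nodes/rgthree-comfy/rgthree_config.json
    config_path = os.path.join("custom_nodes", "rgthree-comfy", "rgthree_config.json")

    # Load whatever config already exists so no other settings are lost.
    config = {}
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)

    # Turn off the recursive-execution optimization (step 2 above).
    config["patch_recursive_execution"] = False

    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)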

Now, for both @Trung0246 and @Pojo267 here's why this workflow may cause problems...

ComfyUI's recursive execution is incredibly inefficient. It doesn't matter much for standard workflows, but once nodes with many inputs/outputs are used it becomes unusable; I've seen 15 minutes spent just recursively evaluating the graph for what was actually a 2-minute model execution. The optimization I've added should make this thousands of times faster by caching results (which should be stable, since the graph isn't changing mid-run).
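
To illustrate with a toy sketch (this is not ComfyUI's or rgthree's actual code, and the graph below is made up): without a cache, every input edge re-walks its entire upstream subgraph, so shared ancestors get revisited an exponential number of times; with a cache, each node is only ever expanded once.

    def walk(node_id, inputs, cache=None, stats=None):
        """Stand-in for a recursive graph evaluation: walk every upstream input."""
        stats["calls"] += 1
        if cache is not None and node_id in cache:
            return  # already walked this subgraph, and the graph hasn't changed
        for upstream in inputs.get(node_id, []):
            walk(upstream, inputs, cache, stats)
        if cache is not None:
            cache[node_id] = True

    # 20 "layers", each node wired to the previous node on two separate inputs.
    inputs = {f"n{i}": [f"n{i-1}", f"n{i-1}"] for i in range(1, 21)}

    stats = {"calls": 0}
    walk("n20", inputs, cache=None, stats=stats)
    print(stats["calls"])   # 2_097_151 calls -- doubles with every extra layer

    stats = {"calls": 0}
    walk("n20", inputs, cache={}, stats=stats)
    print(stats["calls"])   # 41 calls -- each node is evaluated only once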

This is especially true when using nodes with many inputs and outputs, like the rgthree Context nodes or, here, the @Trung0246 highway nodes. Each highway node creates multiple branches, and each branch multiplies the number of possible paths to that final Save node. So when ComfyUI asks "how many possible paths lead to this Save node?", the answer is incredibly large.
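
As rough arithmetic (the layer count and branch factor here are made up for illustration, not taken from this workflow): if each layer of branching nodes feeds the next layer over several connections, the number of distinct paths reaching the final Save node multiplies at every layer.

    def count_paths(node, inputs, memo):
        """Count every distinct upstream path that ends at `node`."""
        if node in memo:
            return memo[node]
        upstream = inputs.get(node, [])
        paths = 1 if not upstream else sum(count_paths(u, inputs, memo) for u in upstream)
        memo[node] = paths
        return paths

    # 12 layers, each wired to the previous layer over 4 separate connections.
    inputs = {f"layer{i}": [f"layer{i-1}"] * 4 for i in range(1, 13)}
    inputs["save"] = ["layer12"]

    print(count_paths("save", inputs, memo={}))   # 16_777_216 (4**12) distinct paths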

So, the first step would be turning off the optimization in rgthree_config.json and seeing what happens. If it executes, even if you have to wait many, many minutes, then that's good; maybe there's a change I can make to the optimization.

But, if you still get an overflow, then there's not much more to do without completely rewriting ComfyUI's graph evaluation (the rgthree patch is an outward-in fix).

(Note: it's possible the terminal will not show any change for many, many minutes. Just hit Queue and walk away; if you sit and watch, you'll be tempted to cancel the execution (as I was, many times, before digging into it).)

I can't get the workflow running myself, but let me know how it goes. I'm interested to see what happens here.

rgthree commented 8 months ago

For posterity, here's the initial issue I opened on the execution slowness: https://github.com/comfyanonymous/ComfyUI/issues/1502