comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, API, and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

[Feature Request] TRUE SAVE (saving the states of outputs). #5487

Open GPU-server opened 3 days ago

GPU-server commented 3 days ago

Feature Idea

When you run a workflow, then add an additional Save Image node and rerun it, Comfy will naturally start executing at the newly added nodes (provided nothing has changed in the already-run part of the workflow).

The problem: let's say my workflow produced some complicated "set of images" that are saved by multiple nodes. If I want to add an extra node and rerun, fine, everything runs as expected.

But what if I closed Comfy or the PC and wanted to pick those images up where they were before adding that extra node? I couldn't, EVEN IF the workflow was identical. I would need to run it again and obtain the same "set of images" as before, just to be able to process them with additional nodes.

What I want is a "true save" that saves the workflow WITH ITS OUTPUTS. When I run it, ONLY the "loading model/VAE/etc." nodes will run; anything in between (upscale/ADetailer/encode/decode) is not run. Instead, the workflow REMEMBERS the output it got from its previous run of the same workflow, through the save.

That way I can load a workflow WITH ITS outputs and continue modifying it, even after resetting Comfy or the PC, and I can keep working on that workflow and those outputs directly (I just need to wait for the models to be loaded again).

Is the idea clear?

Existing Solutions

No response

Other

No response

ltdrdata commented 2 days ago

While models and conditioning cannot be saved, latents can be saved. Based on this, it's possible to create workflows that skip steps like sampling or upscaling.
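Conceptually, saving a latent is just serializing the tensor to disk and reading it back in a later session, so everything upstream of it (sampling included) can be skipped. Here is a minimal toy round trip using `pickle` on plain lists; this is a hypothetical stand-in, not ComfyUI's actual `.latent` file format, and the function names are made up for illustration:

```python
import pickle
import tempfile
from pathlib import Path

def save_latent(latent, path):
    """Persist a latent so a later session can resume without re-sampling."""
    Path(path).write_bytes(pickle.dumps({"samples": latent}))

def load_latent(path):
    """Read the saved latent back; downstream nodes can start from it."""
    return pickle.loads(Path(path).read_bytes())["samples"]

# Round trip: survives a restart because it lives on disk, not in RAM.
latent = [[0.1, -0.4], [0.7, 0.2]]  # stand-in for a torch latent tensor
path = Path(tempfile.mkdtemp()) / "step.latent"
save_latent(latent, path)
assert load_latent(path) == latent
```

The same idea does not extend to models or conditioning, whose in-memory objects are far harder to serialize faithfully, which is the limitation described above.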

The capability to save all steps to disk would be an extremely large undertaking. In particular, it would require implementing serialization not only in Comfy core but also in custom nodes, and it's practically impossible to require custom node developers to implement this.

Although it's not practical to cache everything, we need a compromise solution that can bypass costly operations like sampling.
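One shape such a compromise could take is a disk-backed cache keyed by a hash of each node's inputs: an expensive node whose inputs haven't changed is skipped, even across restarts. This is a toy sketch of the idea, not ComfyUI's executor; all names here are hypothetical:

```python
import hashlib
import pickle
import tempfile
from pathlib import Path

CACHE_DIR = Path(tempfile.mkdtemp())  # stands in for a persistent cache dir
calls = []                            # track which nodes actually executed

def cached_node(name, fn, *inputs):
    """Run fn(*inputs) only if no cached result exists for these exact inputs."""
    key = hashlib.sha256(pickle.dumps((name, inputs))).hexdigest()
    path = CACHE_DIR / f"{key}.pkl"
    if path.exists():
        return pickle.loads(path.read_bytes())  # skip the costly operation
    calls.append(name)
    result = fn(*inputs)
    path.write_bytes(pickle.dumps(result))      # persist across restarts
    return result

def sample(seed):  # stand-in for an expensive sampling step
    return [seed * 0.5]

out1 = cached_node("sample", sample, 42)
out2 = cached_node("sample", sample, 42)  # second call: loaded from disk
assert out1 == out2
assert calls == ["sample"]  # sampling ran only once
```

Only node outputs that can actually be serialized (latents, images) would be cacheable this way, which is exactly why a selective bypass is a compromise rather than a full "save everything" feature.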

GPU-server commented 1 day ago

I see, I think. I'm not fluent in all the technicalities; I just pushed the idea out there. If saving the latent can help skip some parts of the workflow, and it's not a struggle, then we can go for it. But if that requires complicated calculations and serialization (I don't know the term yet), I guess I understand it cannot be done.

Anyway if it can be done, I expect it to work like this:

  1. Load a workflow with its "outputs" and inputs (the outputs can then be used as inputs for extra nodes). For example, my workflow produces 5 images as outputs, and I want to add a node that "merges" 3 of the 5 images. I would simply load my workflow again with its 5 outputs and add the merge node.
  2. Pressing Queue makes Comfy start from the outputs (if there is an extra node; otherwise nothing happens).
  3. The first time you press Queue, the models are of course loaded.

So the workflow will SKIP all nodes in between "loading models" and the final output nodes.

If I decide to delete one of the 3 images used by the new node (one that was loaded with the workflow), pressing Queue will go back into the workflow and run the corresponding skipped nodes that are responsible for creating the missing image.
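That re-run rule can be sketched as plain dependency resolution over the node graph: evaluating a node returns its cached output if present, and otherwise recomputes it (and only it) from its dependencies. This is a toy model with a dict standing in for files on disk, not ComfyUI's actual executor; the node names are made up:

```python
# Toy graph: node -> (dependencies, function). Deleting a cache entry
# simulates deleting a saved image from the loaded workflow.
graph = {
    "load_model": ([], lambda: "model"),
    "image_a": (["load_model"], lambda m: f"a({m})"),
    "image_b": (["load_model"], lambda m: f"b({m})"),
    "merge": (["image_a", "image_b"], lambda a, b: f"merge({a},{b})"),
}

cache = {}  # stands in for outputs saved on disk
ran = []    # records which nodes actually executed

def evaluate(node):
    """Return node's output, re-running only nodes with no cached result."""
    if node in cache:
        return cache[node]
    deps, fn = graph[node]
    args = [evaluate(d) for d in deps]
    ran.append(node)
    cache[node] = fn(*args)
    return cache[node]

evaluate("merge")
assert ran == ["load_model", "image_a", "image_b", "merge"]  # first pass: all run

# "Delete" one image; the new merge node must also re-run on its changed input.
del cache["image_b"], cache["merge"]
ran.clear()
evaluate("merge")
assert ran == ["image_b", "merge"]  # load_model and image_a stayed cached
```

In this sketch, only the producers of the missing output (and whatever consumes it) execute on the second pass, which matches the behavior described in the step above.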