comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

Flush Comfy's memory and revert the VRAM usage from low to normal mode #3615

Open murphylanga opened 3 months ago

murphylanga commented 3 months ago

Is there a way to clear the memory (VRAM) after a workflow run? Or possibly even during the run?

I am using a laptop RTX 3070 graphics card and run very large workflows on it. I have optimized the workflows to the point that I can switch all sorts of things on and off. For some things (e.g. FaceSwap or ControlNet), ComfyUI then switches to lowvram mode. That wouldn't be too bad, since I just have to be more patient with less VRAM. But ComfyUI does not find its way back, even when the memory-hungry nodes are switched off again. For example, upscaling could also be performed without ControlNet. Unfortunately, I then always have to restart Comfy, after which I can upscale the image without any problems.

Maybe these custom nodes are a first approach: https://github.com/ntdviet/comfyui-ext. I have installed the LatentGarbageCollector node from there, and it looks like it frees enough memory to work without lowvram.

If it is not possible during the workflow run, could Comfy somehow be switched back to normal VRAM mode before a new run? Or when a workflow is reloaded? So far, unfortunately, only a restart works for me. I would be very happy if someone knows or finds a solution.
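For context, a "garbage collector" node like the linked LatentGarbageCollector typically boils down to something like the following sketch (this is an illustration of the general technique, not that node's actual source):

```python
# Sketch: free cached VRAM by running Python's garbage collector and
# asking PyTorch's CUDA allocator to release its cached blocks.
import gc


def flush_vram():
    gc.collect()  # drop unreachable Python objects that may hold tensor refs
    try:
        import torch
        if torch.cuda.is_available():
            torch.cuda.empty_cache()  # return cached allocator blocks to the driver
            torch.cuda.ipc_collect()  # clean up CUDA IPC memory handles
    except ImportError:
        pass  # torch not installed; nothing GPU-side to free
```

Note this only releases memory that is no longer referenced; it does not unload models ComfyUI is still holding on to, which is why it helps after a run but cannot by itself move Comfy out of lowvram mode.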

mcmonkey4eva commented 2 months ago

> Is there a way to clear the memory (VRAM) after a workflow run?

There's an API route for it (`POST /free` with body `{ "unload_models": true, "free_memory": true }`), and Swarm has a button in the Server tab for that. I don't think Comfy itself currently has a button for it.
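Calling that route from a script can look like the following sketch, using only the standard library. It assumes a default local ComfyUI server at `127.0.0.1:8188`; adjust the host and port to your setup.

```python
# Sketch: POST to ComfyUI's /free route to unload models and free VRAM.
import json
import urllib.request


def build_free_request(host="127.0.0.1", port=8188):
    """Build the POST /free request asking ComfyUI to unload models and free memory."""
    payload = json.dumps({"unload_models": True, "free_memory": True}).encode("utf-8")
    return urllib.request.Request(
        f"http://{host}:{port}/free",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def free_comfy_memory(host="127.0.0.1", port=8188):
    """Send the request; returns the HTTP status code (200 on success)."""
    with urllib.request.urlopen(build_free_request(host, port)) as resp:
        return resp.status
```

You could call `free_comfy_memory()` from a post-run hook or a shell script after each workflow instead of restarting the server.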

> Or possibly even during the run?

That would require a node or a CLI arg. There are relevant CLI args listed here: https://github.com/comfyanonymous/ComfyUI/blob/master/comfy/cli_args.py#L103-L109
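The linked block is the group of mutually exclusive VRAM-mode flags. For example, to pin Comfy to one mode rather than letting it auto-switch (a sketch; run `python main.py --help` to see the flags your version actually supports):

```shell
# Force normal VRAM handling:
python main.py --normalvram

# Or go the other direction and always run in low-VRAM mode:
python main.py --lowvram
```

A fixed mode avoids the stuck-in-lowvram problem at the cost of losing the automatic switching.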

Generally, by default Comfy is designed to avoid running out of memory and will automatically unload models on the fly as needed. This might not work as intended with some custom nodes if they load models without going through that system.

murphylanga commented 1 month ago

Thank you very much for your answer