Nuked88 / ComfyUI-N-Nodes

A suite of custom nodes for ComfyUI that includes GPT text-prompt generation, LoadVideo, SaveVideo, LoadFramesFromFolder, and FrameInterpolator
MIT License

Memory Optimization - Batch Processing #51

Open toscano22 opened 7 months ago

toscano22 commented 7 months ago

First of all, kudos for the amazing work.

I was testing processing a larger video in batches, using the same checkpoint for each batch of images. I noticed that the more batches there are, the more copies of the checkpoint get loaded, in what seem to be subprocesses, taking memory away from the actual processing. Since the model being loaded is the same, couldn't you use the Singleton pattern instead: a boolean check verifies whether the same settings apply across batches, and if so a single model is loaded once and kept as a cache. Each batch would then reference that one cached model instance, rather than having one model loaded per batch (see the sketch below).
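Just to illustrate the idea, here is a minimal sketch of such a cache within a single process. The names (`get_model`, `load_checkpoint`) are hypothetical and not part of the actual ComfyUI-N-Nodes code; this only shows the pattern I mean.

```python
# Hypothetical sketch: keep one cached model instance keyed by its load
# settings, so repeated batches reuse it instead of reloading the checkpoint.

_cached_model = None
_cached_key = None

def get_model(checkpoint_path, settings):
    """Return a shared model instance; reload only when the settings change."""
    global _cached_model, _cached_key
    key = (checkpoint_path, tuple(sorted(settings.items())))
    if _cached_model is None or key != _cached_key:
        # load_checkpoint is a placeholder for whatever loader the node uses
        _cached_model = load_checkpoint(checkpoint_path, **settings)
        _cached_key = key
    return _cached_model
```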

I may be wrong, but I could see my VRAM decreasing at each log line indicating the model was being loaded over and over.

Keep up the good work.