comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

Is it possible to share cached models between two or more ComfyUI instances? #3292

Open perlinson opened 5 months ago

perlinson commented 5 months ago

I have two GPUs and run two ComfyUI instances on different ports, one per GPU. The two instances may run the same workflow, and most of the models they use are the same, but each loads its own copy, so it takes twice the RAM. That is too much. I want to share these identical models among the running instances so they take less RAM. Is that possible? Thank you for your reply!

efwfe commented 4 months ago

That could be possible, but you would have to change the source code to use shared memory or an in-memory store like Redis or Memcached. I don't think it's a good idea to do that.

Hope this helps :)
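For illustration, here is a minimal sketch of the shared-memory idea using Python's standard `multiprocessing.shared_memory` module. The array and the block name `model_weights_demo` are placeholders, not ComfyUI internals; real model weights would be far larger and the lifecycle (who creates, who unlinks) would need careful handling:

```python
import numpy as np
from multiprocessing import shared_memory

# Placeholder standing in for a loaded model's weight tensor.
weights = np.arange(8, dtype=np.float32)

# "Process A": create a named shared-memory block and copy the weights in once.
shm_a = shared_memory.SharedMemory(create=True, size=weights.nbytes,
                                   name="model_weights_demo")
shared_view = np.ndarray(weights.shape, dtype=weights.dtype, buffer=shm_a.buf)
shared_view[:] = weights

# "Process B": attach to the same block by name instead of loading from disk,
# so both processes read the same physical memory.
shm_b = shared_memory.SharedMemory(name="model_weights_demo")
view_b = np.ndarray(weights.shape, dtype=weights.dtype, buffer=shm_b.buf)
match = bool(np.array_equal(view_b, weights))

# Cleanup: close both handles, then unlink the block exactly once.
shm_b.close()
shm_a.close()
shm_a.unlink()
```

Note this only deduplicates CPU RAM; each GPU would still need its own on-device copy of the weights after transfer, which is part of why patching this into ComfyUI is more trouble than it looks.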

ltdrdata commented 4 months ago

If you want to do that, you should run multiple server instances within a single ComfyUI process instead of running multiple ComfyUI processes.