comfyanonymous / ComfyUI

The most powerful and modular stable diffusion GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/

Integration with an externally hosted backend (LocalAI?) #3365

Open · jtwolfe opened this issue 2 months ago

jtwolfe commented 2 months ago

Just thinking aloud.

How easy would it be to separate out the backend's functionality and hand it off to an external hosting tool like LocalAI?

I have been investigating ways to consolidate resource requirements for AI tools, and one topic I keep hitting is model size, which may become especially important as larger models become available or are required to reach certain quality and performance standards.

The thought process: if you have 20 artists all using generative tools at roughly a 5% duty cycle, then today each of them would need a workstation with enough GPU resources to host the model if they all wanted to run ComfyUI. By externalising the backend to a shared system, instead of 20 machines each sitting at 5% duty cycle you would have one machine at close to 100% duty cycle.

I can see there is some separation between the backend and the frontend, but it is hard to tell whether integration into LocalAI would be straightforward without spending considerable time on it. Also, making the backend accessible over the internet would likely require authentication features that LocalAI does not yet have (though they are being discussed).
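To make the idea concrete, here is a rough sketch of what a "thin client, shared backend" setup could look like with ComfyUI as it is today, using the HTTP API its bundled API example scripts already use (`POST /prompt` to queue a workflow, `GET /history/<id>` to fetch results). The hostname, port, client ID, and `workflow.json` file are placeholders, and this does nothing about the authentication question raised above.

```python
# Sketch: submit a workflow to a remotely hosted ComfyUI backend over HTTP.
# Assumes a ComfyUI instance is reachable at COMFY_HOST and that workflow.json
# was exported from the UI via "Save (API Format)".
import json
import time
import urllib.request

COMFY_HOST = "http://gpu-server.example.com:8188"  # placeholder shared backend


def queue_prompt(workflow: dict, client_id: str) -> str:
    """Queue a workflow on the backend and return its prompt_id."""
    payload = json.dumps({"prompt": workflow, "client_id": client_id}).encode("utf-8")
    req = urllib.request.Request(f"{COMFY_HOST}/prompt", data=payload)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["prompt_id"]


def wait_for_history(prompt_id: str, poll_seconds: float = 2.0) -> dict:
    """Poll /history until the queued prompt has finished executing."""
    while True:
        with urllib.request.urlopen(f"{COMFY_HOST}/history/{prompt_id}") as resp:
            history = json.loads(resp.read())
        if prompt_id in history:
            return history[prompt_id]
        time.sleep(poll_seconds)


if __name__ == "__main__":
    with open("workflow.json") as f:
        workflow = json.load(f)
    prompt_id = queue_prompt(workflow, client_id="artist-workstation-01")
    result = wait_for_history(prompt_id)
    print("finished nodes with outputs:", list(result.get("outputs", {}).keys()))
```

Each artist's machine would only need to run something this light (or point the web UI at the shared host), while a single GPU box with the models loaded serves the whole team.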

ltdrdata commented 2 months ago

https://www.reddit.com/r/comfyui/s/lbdT0EmSuc

Maybe you will be interested in this.

jtwolfe commented 2 months ago

hmmm