ai-dock / comfyui

ComfyUI docker images for use in GPU cloud and local environments. Includes AI-Dock base for authentication and improved user experience.

Newbie question: Run a serverless on-demand API backend on runpod.io or vast.ai #88

Open gymnae opened 2 months ago

gymnae commented 2 months ago

Hi, I'm a hobbyist and occasionally would like to create images or run queries. Serverless seems like a good fit with cost control. I tried to use this image for a serverless installation on runpod, but was unable to call the API as a backend from my local ComfyUI install. Would this work in a worker/serverless setting, or would I need to run an instance on vast.ai or runpod?

Cheers

robballantyne commented 2 months ago

I'll have a demo for Vast's autoscaler ready soon.

You'll generally create a handler that's invoked by the serverless controller; it should interact with ComfyUI on localhost:18188, i.e. not the external port 8188. The external port requires caddy to run, and you won't want that because the container should start fast on serverless.
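A minimal sketch of such a handler, assuming a typical serverless runtime that passes an `event` dict; the `handler` name, the event shape, and the `workflow` key are assumptions for illustration, not part of this repo. It posts a workflow (in ComfyUI's API format) to the internal port so caddy isn't needed:

```python
# Sketch of a serverless handler forwarding a workflow to the ComfyUI
# instance on localhost:18188 (internal port, no caddy required).
import json
import urllib.request

COMFYUI_URL = "http://localhost:18188"  # internal port; skips caddy

def build_prompt_request(workflow: dict) -> urllib.request.Request:
    """Wrap a ComfyUI workflow (API format) into a POST /prompt request."""
    body = json.dumps({"prompt": workflow}).encode("utf-8")
    return urllib.request.Request(
        f"{COMFYUI_URL}/prompt",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def handler(event: dict) -> dict:
    """Entry point invoked by the serverless controller (assumed signature)."""
    req = build_prompt_request(event["workflow"])
    with urllib.request.urlopen(req) as resp:
        # ComfyUI replies with a prompt_id; poll /history/<prompt_id>
        # to retrieve the finished outputs.
        return json.loads(resp.read())
```

The request-building step is kept separate from the network call so it can be tested without a running ComfyUI instance.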

You can achieve the fast start by declaring the environment variable SERVERLESS=true, or SUPERVISOR_NO_AUTOSTART=caddy,jupyter,syncthing as required.
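For example, the two variants above could be passed at container start like this (the image tag is illustrative; check the repo's published tags for the one you need):

```shell
# Fast serverless boot: skip all interactive services
docker run --gpus all \
  -e SERVERLESS=true \
  ghcr.io/ai-dock/comfyui:latest

# Or keep normal mode but prevent specific services from autostarting
docker run --gpus all \
  -e SUPERVISOR_NO_AUTOSTART=caddy,jupyter,syncthing \
  ghcr.io/ai-dock/comfyui:latest
```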

I'm about 90% done with a universal async API wrapper for processing ComfyUI workflows which should make serverless integration very easy - I just have to finish adding timings and webhooks https://github.com/ai-dock/comfyui/tree/main/build/COPY_ROOT_1/opt/ai-dock/api-wrapper

gymnae commented 2 months ago

That sounds great :) I'd prefer to run it on vast, so cool :)

field-mouse commented 2 weeks ago

> I'm about 90% done with a universal async API wrapper for processing ComfyUI workflows which should make serverless integration very easy - I just have to finish adding timings and webhooks

Any updates?