blib-la / runpod-worker-comfy

ComfyUI as a serverless API on RunPod
GNU Affero General Public License v3.0

Custom nodes from the network storage pod aren't found #37

Open moteloumka opened 4 months ago

moteloumka commented 4 months ago

Following the README installation tutorial, I got the impression that adding custom nodes through network storage should do the trick as long as they are in the right folder. However, after running some requests, I see that any custom nodes are ignored. Sorry to ask here, but is there an easy way to reach the extra_model_paths.yaml file to add additional paths (I suppose that's the easiest way to add them), or will it be necessary to rebuild the Docker container just to add a path to the custom nodes? I feel like I'm missing something here; if someone could indicate how to solve this issue, I would be very thankful!

larchman01 commented 4 months ago

did you solve this?

moteloumka commented 4 months ago

@larchman01 not really. I made a new image by forking this repo and adding the path to the custom nodes on the network storage, i.e. adding the line `custom_nodes: custom_nodes` to the src/extra_model_paths.yaml file. Now ComfyUI manages to find the custom nodes, but the nodes themselves seem to have a hard time finding the models that are on the network storage... I imagine there should be some way to fix this, otherwise the mere existence of this repo wouldn't make much sense, yet it's currently unclear to me how to make things work.
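For context, a sketch of what such an src/extra_model_paths.yaml could look like. The section name and the model subfolders are illustrative; /runpod-volume is where RunPod typically mounts network volumes in serverless, but check your own setup:

```yaml
# src/extra_model_paths.yaml — illustrative sketch, adjust paths to your volume layout
runpod_worker_comfy:
  base_path: /runpod-volume
  checkpoints: models/checkpoints/
  loras: models/loras/
  vae: models/vae/
  custom_nodes: custom_nodes/
```

All paths are resolved relative to base_path, so both the models and the custom nodes need to live under the same mount for ComfyUI to find them.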

TimPietrusky commented 4 months ago

@moteloumka @larchman01 yes, sorry, this was supposed to be the solution, but then we saw that when we use custom nodes from a network volume, they still need to install their dependencies on every request, which increases the cold start time.

And we can't just add the custom nodes path to the yaml, because if the path doesn't exist, ComfyUI will not start.

In https://github.com/blib-la/runpod-worker-comfy/pull/30 we are working on a solution to bake custom nodes directly into the Docker image.

Until this is resolved, you need to install them directly into the image manually as described in the README.
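As a sketch of what "baking into the image" means in practice: clone the node and install its dependencies at build time, so nothing is installed per request. The node repo below and the /comfyui path are assumptions based on typical ComfyUI images; check this repo's Dockerfile for the actual layout:

```dockerfile
# Sketch: bake a custom node (ComfyUI-Manager used as an example) into the image
WORKDIR /comfyui/custom_nodes
RUN git clone https://github.com/ltdrdata/ComfyUI-Manager.git && \
    pip install -r ComfyUI-Manager/requirements.txt
WORKDIR /comfyui
```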

niko2020 commented 2 months ago

#30 is very interesting, but if we have a highly customized ComfyUI installation (on a RunPod volume), with some scripts that would not be restored by a Manager snapshot, what would be the best practice to bake it into the Docker image?

Cheers for this beautiful project!

alka7ex commented 1 month ago

Hi @TimPietrusky @moteloumka

So I'm trying to build a custom Docker image because I need to add custom nodes, but the question is: do you build the images locally or with GitHub Actions?

I'm trying to build the Docker images using GitHub Actions but facing an issue where the runner runs out of disk space. Are there any alternatives?

System.IO.IOException: No space left on device : '/home/runner/runners/2.319.1/_diag/Worker_20240907-125323-utc.log'
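One common workaround (not specific to this repo) is to delete the large preinstalled toolchains on the hosted ubuntu runner before the build step; the exact paths below reflect what GitHub's images have typically shipped and may change:

```yaml
# Sketch of a pre-build cleanup step in the workflow
- name: Free disk space on the runner
  run: |
    sudo rm -rf /usr/share/dotnet /usr/local/lib/android /opt/ghc
    sudo docker image prune -af
    df -h /
```

This usually frees tens of GB; if the image is still too large, building locally (as suggested below) or on a self-hosted runner remains the fallback.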

moteloumka commented 1 month ago

@alka7ex I found that the easiest way was to install Docker, build the image locally, and push it to Docker Hub. It's a bit tricky because you'll need Docker installed and a good internet connection (otherwise downloading all the packages takes ages, as does uploading the whole image to Docker Hub), but overall, once you figure out the basics of Docker, you have full control over your custom image and can use this repo as a great starting point that takes care of all the functionality.
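The local workflow is roughly the following (the user name and tag are placeholders, and the repo's README documents any build args the Dockerfile expects, so check there first):

```shell
# clone your fork containing the modified Dockerfile / extra_model_paths.yaml
git clone https://github.com/<your-user>/runpod-worker-comfy.git
cd runpod-worker-comfy

# build locally, then push to Docker Hub so RunPod can pull it
docker build -t <your-user>/runpod-worker-comfy:dev .
docker login
docker push <your-user>/runpod-worker-comfy:dev
```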

drommerkiller commented 1 week ago

> @alka7ex I found that the easiest way was installing docker and building the thing locally then pushing it on docker hub

Is there a guide on building a ComfyUI Docker image that will work with this repo? As you said, it takes time, so a guide on how to do it would be great. There is a guide on creating a Stable Diffusion RunPod image, but ComfyUI differs enough that it did not work when I tested it.

Custom nodes and the dependencies they need are quite essential for ComfyUI. I'm looking for an alternative to Replicate, as the cold boots/queue there have gone through the roof for a custom ComfyUI model. I'm talking 5-15 min of waiting to run 30 sec of inference.