pythongosssss / ComfyUI-Custom-Scripts

Enhancements & experiments for ComfyUI, mostly focusing on UI features
MIT License

Lora Loader waits at "got prompt" for a long time (a few minutes) #327

Open xueqing0622 opened 1 month ago

xueqing0622 commented 1 month ago

The Lora Loader is good for notes, but it waits at "got prompt" for a long time (a few minutes). [image]

With ComfyUI's default Lora Loader there is no wait at "got prompt" at all.

wencxxxxxx commented 1 month ago

same

NeedsMoar commented 2 weeks ago

It doesn't take anywhere near the minute range for me even with a bunch of them (more like 10-15s every time I hit generate, before the queue item even shows up). But it seems like, with the updates to ComfyUI's execution method to build the execution list from the beginning instead of recursing from the back, something in both the checkpoint loader with image and the lora loader with image from this pack is causing comfy to think the loaded model has changed every time. It appears to reload the checkpoint on every run, and since I have enough RAM / VRAM that files this small usually wouldn't get unloaded, that seems to be the case. Then it has to re-apply everything. That doesn't account for all of the time, though: the first run of one of these checkpoints usually wasn't 10-15s slower than the second run, 5 seconds maybe...
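For context, ComfyUI's executor decides whether a node needs to re-run by calling an optional `IS_CHANGED` classmethod on the node and comparing its return value against the previous run; if a custom node (or its widget state) makes that value drift every prompt, the executor treats the model as changed and reloads it. A minimal self-contained sketch of the idea follows; the node class and the executor-side check are illustrative stand-ins, not ComfyUI's actual code:

```python
import os

class LoraLoaderNode:
    """Hypothetical stand-in for a ComfyUI loader node; only the
    cache-signalling part is shown."""

    @classmethod
    def IS_CHANGED(cls, lora_path):
        # The executor compares this value with the one from the previous
        # run. An identical value means the cached output can be reused,
        # so the checkpoint/LoRA is NOT reloaded. Returning something
        # stable for an unchanged file (here: its mtime) is the point.
        return os.path.getmtime(lora_path)

def needs_rerun(node_cls, arg, previous_signature):
    """The executor-side comparison, reduced to its essence."""
    return node_cls.IS_CHANGED(arg) != previous_signature
```

If a node instead returns something that differs on every call (or the widget data hashed into the node changes each prompt), `needs_rerun` is always true and the whole load is repeated.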

My normal workflows chain a bunch of these lora loaders together and use one of the checkpoint loaders, because the image previews without having to do anything make things a lot faster. I've set up the right-click / view model info to work with other loaders, but it's massively slower thanks to the need to contact civitai, which is slow at the best of times. This information should probably be cached locally anyway. I'm hitting more and more loras that were removed from the site for whatever reason (probably people protesting Stability) and no longer have activation words listed, just whatever was used in the images that were generated with them and are still on the site. Most of the time that's plenty, but some loras had a ton of different activation keywords that did different things.
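Caching those lookups locally is straightforward to sketch: key the cache on the model file's hash (civitai's API can look models up by hash) and only hit the network on a miss. Everything below is a hypothetical illustration with made-up names, not this extension's code; `fetch_fn` stands in for the slow remote lookup:

```python
import hashlib
import json
import os

def cached_model_info(model_path, fetch_fn, cache_dir="model_info_cache"):
    """Return model info, consulting a local JSON cache first.
    fetch_fn(digest) is only called on a cache miss, so the slow
    network round-trip happens at most once per file."""
    os.makedirs(cache_dir, exist_ok=True)
    # Key the cache on the file's SHA-256 contents hash, so renamed
    # files still hit the same cache entry.
    with open(model_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    cache_file = os.path.join(cache_dir, digest + ".json")
    if os.path.exists(cache_file):
        with open(cache_file) as f:
            return json.load(f)
    info = fetch_fn(digest)          # slow network call, once per file
    with open(cache_file, "w") as f:
        json.dump(info, f)
    return info
```

A side benefit of a local cache: info for loras that later get deleted from the site (activation words included) survives as long as the cache does.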

Anyway, I think this is from one of two things...

  1. The stored list of filename / image-file-name pairs. Comfy generates the folder->filename map with a function call that caches results in a global variable and performs lookups from that first; the only file access it does is comparing the stored modification time of the cached directory against the actual directory with getmtime, which is fast on any OS. This data is then used for anything that does lookups from that directory. The cached data, in the form `[tuple[list[str], dict[str, float], float]]` where the float is the modification time, isn't sorted or changed from the order returned by the OS until it's passed back to the caller. The node overrides in this extension inherit the map and its method of generation from the parent, then add things to it. Because Python has the most enormous failure of an attempt at a class system on earth, I'm not sure whether the parent data is modified, but it doesn't really matter. A new list is generated with enumerate(): an index-mapped dict of names mapped to file names from the parent, plus an "option" containing the image if it exists. This is then returned after sorting it by name. I'm not sure how enumerate decides ordering, but whatever the case, the index doesn't seem to be used for anything except forming part of the identifier for items in the dropdown. I know for a fact that this breaks when an item without a preview image has one added through the UI and the JavaScript end forces a combo-box update: you'll get an error resembling "can't map item #xxx over list" if you try to run the workflow, because the lora / checkpoint that's currently selected no longer contains data after this happens. It works after reloading the interface or after re-selecting the same item from the list. In other words, it updates the list, but the enumeration changes and the same item (based on filename) isn't automatically re-selected, so it's referencing an invalid item somehow.
I suspect that this whole scheme somehow messes with comfy's ability to recognize that the LoRA or checkpoint is already loaded, too... or maybe the dynamic changes to node items from javascript do that. It seems like there should be some kind of method that can be implemented on the node to tell comfy that no cache update is required, but I'm not familiar with any of it.
  2. There's also that optional & hidden "prompt" parameter that never seems to be set to anything before being passed back to the comfy base class's load_x method. It's only set by JavaScript in the read-only "examples" widget, and (I guess?) is the "STRING" output that isn't labeled as anything else or well defined anywhere. This widget is handled strangely: apparently it's deleted entirely by the JavaScript at run time if it doesn't contain anything... then recreated later? That probably counts as a node update for the cache too, and a node update to the checkpoint loader implies reloading the whole checkpoint. With the lora loaders it's either reloading the lora, or reloading it AND re-applying weights to the model, which is what it looks like is happening.
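The directory-listing cache described in point 1 can be sketched roughly like this. This is a simplified stand-in for the scheme described above, not ComfyUI's actual `folder_paths` code; the point is that a cache hit costs only one `getmtime` call, so the cache itself is not where minutes go:

```python
import os

# Module-level cache, analogous to comfy's global variable:
# directory -> (entries, directory mtime at the time of listing)
_filename_cache = {}

def get_filename_list(directory):
    """Return the file names in `directory`, re-reading from disk only
    when the directory's modification time has changed."""
    mtime = os.path.getmtime(directory)
    cached = _filename_cache.get(directory)
    if cached is not None and cached[1] == mtime:
        return cached[0]                 # fast path: one getmtime, no listing
    entries = sorted(os.listdir(directory))
    _filename_cache[directory] = (entries, mtime)
    return entries
```

A subclass that rebuilds an `enumerate()`-indexed view of this list on every call would still be cheap; the breakage described above comes from the indices shifting under the currently selected item, not from file-system cost.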

I hope this gets fixed because the insta-previews are a huge usability win. I have too many loras lying around to keep track of without them, especially given the monumentally stupid filenames some of them have.

CloudWalker-II commented 1 week ago

I have this issue as well.

xueqing0622 commented 1 week ago

It now reads the .txt file with the same name in the same folder, which takes no time. What I can't understand is why, at execution, it waits so long for "got prompt".
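The same-folder .txt lookup described here is indeed cheap. A minimal sketch of that behaviour, as a hypothetical helper (not the extension's actual function): look for a file with the same stem next to the LoRA and read it if present.

```python
from pathlib import Path

def load_lora_notes(lora_path):
    """Read the notes/trigger-word file stored next to a LoRA:
    same folder, same stem, .txt extension. Returns "" if absent.
    One stat() plus at most one small file read, so this cannot
    explain a multi-minute wait before "got prompt"."""
    txt = Path(lora_path).with_suffix(".txt")
    return txt.read_text(encoding="utf-8") if txt.exists() else ""
```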