jjangga0214 opened 1 year ago
The Efficiency Nodes extension (https://github.com/LucianoCirino/efficiency-nodes-comfyui) has a three Lora stacker node, as well as a checkpoint loader with one optional Lora.
I think the ask here is something like allowing `<lora:foo.safetensors:1.4>` syntax, or something similar to how embeddings are handled.
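To make the idea concrete, here is a minimal sketch of how such `<lora:name:weight>` tags could be pulled out of a prompt string. This is an illustration only, not ComfyUI's or A1111's actual implementation; the function name and default weight of 1.0 are assumptions.

```python
import re

# Matches <lora:NAME> or <lora:NAME:WEIGHT>; hypothetical sketch,
# not the actual parser used by any existing extension.
LORA_TAG = re.compile(r"<lora:([^:>]+)(?::([\d.]+))?>")

def extract_loras(prompt: str):
    """Return (cleaned_prompt, [(lora_name, weight), ...])."""
    loras = []

    def _collect(match):
        name = match.group(1)
        # Assume weight defaults to 1.0 when omitted, as in A1111.
        weight = float(match.group(2)) if match.group(2) else 1.0
        loras.append((name, weight))
        return ""  # strip the tag from the remaining text prompt

    cleaned = LORA_TAG.sub(_collect, prompt).strip()
    return cleaned, loras

cleaned, loras = extract_loras("a portrait <lora:foo.safetensors:1.4> <lora:bar>")
# cleaned == "a portrait", loras == [("foo.safetensors", 1.4), ("bar", 1.0)]
```

The extracted `(name, weight)` pairs could then be fed to whatever node currently loads LoRAs, while the cleaned text goes to the text encoder as usual.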
FYI, the ImpactWildcardEncode node does this, among other things.
Hi!
As we know, in the A1111 webui, LoRA (and LyCORIS) is applied via the prompt. IMHO, supporting LoRA as a prompt (in addition to as a node) would be convenient.
Simplicity
When using many LoRAs (e.g. for character, fashion, background, etc.), the graph easily becomes bloated. Text prompts can reduce this.
Sharability
People tend to share generation metadata (EXIF) in A1111 format, and a text prompt is good for copy-and-paste. Sometimes what users want is to get started as quickly as possible: they are so excited by a new idea that just checking out the result ASAP has the highest priority.
Experiment
Users of ComfyUI tend to be more hard-core than those of A1111; they experiment a lot. In many cases, text is faster to edit (with autocompletion or a text editor). For example, with the A1111 webui, I use VS Code's find-and-replace to change multiple LoRA weights at once. What's more, I "generate" a text list of LoRAs with specific weights and additional args from a simple script (e.g. .js, .py, or .sh) for experiments. This practice saves a lot of time and is thus essential in my workflow.
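The kind of helper script described above might look like the following sketch: it sweeps several weights over several LoRAs and prints one prompt snippet per combination, ready to paste. The LoRA file names and weight range here are made-up examples.

```python
from itertools import product

# Hypothetical LoRA names and weights, for illustration only.
loras = ["character.safetensors", "fashion.safetensors"]
weights = [0.6, 0.8, 1.0]

# One line per weight combination, e.g.
# <lora:character.safetensors:0.6> <lora:fashion.safetensors:0.8>
for combo in product(weights, repeat=len(loras)):
    tags = " ".join(f"<lora:{name}:{w}>" for name, w in zip(loras, combo))
    print(tags)
```

With 2 LoRAs and 3 weights this emits 9 lines; being plain text, the output can be pasted into a prompt field or post-processed further, which is exactly the workflow a node graph makes awkward.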
Conclusion
I like node-based workflows, but some things, LoRA especially, can benefit from being part of the prompt. Prompt grammar is a DSL, and a well-defined DSL is really productive.
What do you think?
Thank you :)