rgthree / rgthree-comfy

Making ComfyUI more comfortable!

Power Prompt gives error when using <lora:> tag without model/clip inputs #312

Closed · MantisCore closed this issue 1 month ago

MantisCore commented 1 month ago

When a LoRA is added to the text prompt in Power Prompt via a `<lora:...>` tag, whether selected from the menu or entered manually, the node throws an error if the model/clip inputs are not connected.

I use two Power Prompt nodes in sequence. The first selects the LoRA and the prompt text; I leave its model/clip inputs unconnected because I do not want the LoRA tag stripped from the prompt text (I save this info in the metadata). That raw text is then fed into a second Power Prompt node for conditioning. This worked fine until recently but appears to be broken now.

I am pretty new to Python, but it looks like the code is trying to unpack a 2-element tuple into four variables.
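
For readers hitting the same trace: the symptom is consistent with a code path that returns a shorter tuple when the optional model/clip inputs are missing, while the caller always unpacks four values. Below is a minimal sketch of that pattern and a guarded fix; the names (`process_prompt`, `apply_lora`, `LORA_TAG`) are illustrative assumptions, not rgthree's actual implementation.

```python
import re

# Matches tags like <lora:name> or <lora:name:0.8>.
LORA_TAG = re.compile(r"<lora:([^:>]+?)(?::([\d.]+))?>")

def apply_lora(model, clip, name, strength):
    # Stand-in for a real LoRA loader; would return the patched model/clip.
    return model, clip

def process_prompt(text, model=None, clip=None):
    """Strip <lora:...> tags and apply them when model/clip are connected."""
    if model is None or clip is None:
        # A buggy version of this branch returned only (text, text), so a
        # caller doing `model, clip, prompt, raw = process_prompt(...)`
        # failed with "not enough values to unpack (expected 4, got 2)".
        # Keeping the tuple shape consistent, and leaving the tags in the
        # text for a downstream node to consume, avoids the error.
        return model, clip, text, text
    stripped = LORA_TAG.sub("", text).strip()
    for name, strength in LORA_TAG.findall(text):
        model, clip = apply_lora(model, clip, name, float(strength or 1.0))
    return model, clip, stripped, text
```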

rgthree commented 1 month ago

Thanks for the report. This should be fixed with the latest now.