When adding a LoRA to the text prompt in Power Prompt via the tag, either by selecting it in the menu or by entering it manually, the node throws an error if model/clip are not connected.
I use two Power Prompts in sequence. The first selects the LoRA and the prompt text; it does not use the model/clip inputs because I do not want the node to strip that information from the prompt text (I save this info in the metadata). The raw text is then fed into a second Power Prompt node for conditioning. This used to work fine until recently, but it seems to be broken now.
I am pretty new to Python, but it looks like the code is trying to unpack a 2-value tuple into 4 variables.
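For context, here is a minimal sketch (not the actual Power Prompt code, and the function and names are hypothetical) of the kind of mismatch that produces this Python error: a code path returns 2 values when model/clip are missing, while the caller unpacks 4.

```python
# Hypothetical sketch of the suspected bug, not rgthree's real code:
# an early-exit path returns a 2-tuple when model/clip are missing...
def load_lora(model, clip):
    if model is None or clip is None:
        return (model, clip)
    return (model, clip, "lora_name", 1.0)

try:
    # ...but the caller unpacks 4 values, which raises a ValueError.
    model, clip, name, strength = load_lora(None, None)
except ValueError as e:
    print(e)  # not enough values to unpack (expected 4, got 2)
```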