jitcoder / lora-info

[FEATURE REQUEST] - Output triggers #2

Closed alessandroperilli closed 9 months ago

alessandroperilli commented 9 months ago

It would be great if the Lora Info node could export just the trigger words (rather than the entire output) through an output pin. That way, I could further manipulate them and automatically insert them into other nodes.

Thank you for considering my suggestion.

jitcoder commented 9 months ago

It's a good idea. I think the node I used as a reference to make this already does it: https://github.com/badjeff/comfyui_lora_tag_loader

I didn't play around with it much, but I think it does just that. My issue with it was that I wanted it to display info, examples, etc. Please check the above out; if it doesn't work as expected, let me know and I'll add the feature to this node.

alessandroperilli commented 9 months ago

My suggestion does the reverse of what the LoRA Tag Loader does. I want to extract the trigger words from the LoRA Info dump and place them into an otherwise empty prompt node (or a debug node for logging purposes, etc.).

jitcoder commented 9 months ago

I see, let me see what I can do. (Hopefully I understood everything correctly; this is day two with generative AI for me, lol.)

alessandroperilli commented 9 months ago

I know. I read your intro post on Reddit. I appreciate any time you'll dedicate to this request, if and when you feel comfortable with the complexities of the ComfyUI world. Even in its current form, the LoRA Info node is useful, and it's scheduled to be included in my upcoming AP Workflow 6.1.

jitcoder commented 9 months ago

So it turns out I can't combine the 'display' functionality with exporting values, so I created a sibling node that does the exporting.

image

Is this what you had in mind?
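For readers unfamiliar with how such an exporting sibling node might look, here is a rough sketch of a ComfyUI node class with string output pins. The class name, attribute values, and the metadata shape are illustrative assumptions, not the actual lora-info code:

```python
# Hypothetical sketch of a ComfyUI node that exports LoRA metadata as
# string outputs instead of rendering it in the UI. Names are illustrative.

class LoraInfoExtractor:
    """Outputs trigger words and an example prompt for a selected LoRA."""

    RETURN_TYPES = ("STRING", "STRING")
    RETURN_NAMES = ("trigger_words", "example_prompt")
    FUNCTION = "extract"
    CATEGORY = "utils"

    def extract(self, metadata: dict):
        # 'metadata' stands in for the Civitai lookup result the
        # display node already fetches; the keys are assumed.
        trigger_words = ", ".join(metadata.get("trainedWords", []))
        example_prompt = metadata.get("examplePrompt", "")
        return (trigger_words, example_prompt)
```

In ComfyUI, the returned tuple maps positionally onto `RETURN_TYPES`/`RETURN_NAMES`, which is what makes each value appear as its own output pin.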

alessandroperilli commented 9 months ago

Yes, this works fine (although I think your outputs are inverted: it seems that the trigger_words output is outputting the example prompt and vice versa).

It would be great to not have the opening and closing quotes for each output, but it's a minor issue.
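The stray quotes suggest the string values may be getting JSON-serialized somewhere before reaching the output. A defensive unquoting helper on the consuming side could look like this (a sketch, not code from the node):

```python
import json

def unquote(value: str) -> str:
    """Remove one layer of JSON-style quoting, if present."""
    if value.startswith('"') and value.endswith('"'):
        try:
            # a json.dumps()-ed string round-trips cleanly
            return json.loads(value)
        except json.JSONDecodeError:
            # fall back to trimming literal quote characters
            return value.strip('"')
    return value
```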

However, and exclusively FYI, I think it's possible to both display and output at the same time. Look for example at the textDebug node in the ttN suite:

Screenshot 2023-12-09 at 12 39 59

jitcoder commented 9 months ago

> Yes, this works fine (although I think your outputs are inverted: it seems that the trigger_words output is outputting the example prompt and vice versa).
>
> However, and exclusively FYI, I think it's possible to both display and output at the same time. Look for example at the textDebug node in the ttN suite:
>
> Screenshot 2023-12-09 at 12 39 59

oh sweet, thx for the info, I'll check it out

jitcoder commented 9 months ago

image

jitcoder commented 9 months ago

addressed: https://github.com/jitcoder/lora-info/commit/db2c5d4df5b438d7efe83d1ccedd88be66b82ad8

alessandroperilli commented 9 months ago

Super. It works great! Thanks for all the time you dedicated to this during your weekend.

UPDATE:

The node generates the following error for that particular LoRA I flagged in the other issue:

File "xyz/ComfyUI/execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "xyz/ComfyUI/execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "xyz/Tools/ComfyUI/execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "xyz/ComfyUI/custom_nodes/lora-info/lora_info.py", line 118, in lora_info
    (output, triggerWords, examplePrompt, baseModel) = get_lora_info(lora_name)
File "xyz/ComfyUI/custom_nodes/lora-info/lora_info.py", line 64, in get_lora_info
    trainedWords = ",".join(model_info.get("trainedWords"))
TypeError: can only join an iterable

jitcoder commented 9 months ago

Whoops, accidentally reintroduced it in the refactor. I've fixed it in the latest push.
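For context, the traceback points at `model_info.get("trainedWords")` returning None when Civitai reports no trigger words for a model. The usual guard looks like this (a sketch of the likely fix, not the exact commit):

```python
def join_trained_words(model_info: dict) -> str:
    """Join trigger words into a comma-separated string, tolerating
    models whose Civitai metadata has no 'trainedWords' entry."""
    # .get() returns None when the key is absent or explicitly null,
    # and ",".join(None) raises "TypeError: can only join an iterable",
    # so substitute an empty list before joining.
    return ",".join(model_info.get("trainedWords") or [])
```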

alessandroperilli commented 9 months ago

Now it works fine, thank you. At this point, you might consider adding your node to the DB list of ComfyUI Manager, so other people who didn't see the Reddit announcement will find it.

jitcoder commented 9 months ago

> Now it works fine, thank you. At this point, you might consider adding your node to the DB list of ComfyUI Manager, so other people who didn't see the Reddit announcement will find it.

Sweet. Yeah, I will soon. I'm going to investigate a custom node I saw that reacts to 'input change' without the user having to queue the workflow. I'd like LoRA Info to execute and show info as soon as the user selects a LoRA, instead of having to hit Queue.

I think I'm also going to add another "quality of life" node, like a prompt selector, where the user can just select and use pre-made prompts, e.g. a Universal Negative Prompt, a Face Test that checks a workflow for face-detailing accuracy, a Hand Test, or something similar. It'd be hooked up to GitHub so community members can add/share their prompts.

I've already created a half built node that does this for me.

Will probably end up adding more nodes like this as I build out my workflow.

alessandroperilli commented 9 months ago

I've never seen a node that automatically reacts to state changes without queuing the generation, and I've seen hundreds. But if such a node exists and can be done, it would be amazing.

Re the prompt selector, there are dozens of them. You can find them by searching for "prompt" on this page.

I don't like any of them because even if you can see the list of prompt presets from dropdown menus, you never remember exactly what's in each preset. So I ended up creating a very large visual prompt builder for the AP Workflow. I prefer to see the full extent of the prompt I'll choose before choosing it.

jitcoder commented 9 months ago

> I've never seen a node that automatically reacts to state changes without queuing the generation, and I've seen hundreds. But if such a node exists and can be done, it would be amazing.
>
> Re the prompt selector, there are dozens of them. You can find them by searching for "prompt" on this page.
>
> I don't like any of them because even if you can see the list of prompt presets from dropdown menus, you never remember exactly what's in each preset. So I ended up creating a very large visual prompt builder for the AP Workflow. I prefer to see the full extent of the prompt I'll choose before choosing it.

done ;)

Now it reacts to user input and updates/displays the output and base model immediately instead of having to queue.

jitcoder commented 9 months ago

WRT the prompt selector, and given the above update (i.e. the ability to update UI elements based on user input), I can probably put together a prompt selector node that shows what's in the preset.

I was also playing around building a workflow and started to feel the need for an "Any Switch". There are some switches in rgthree, but they are type-specific and require a boolean input. I was thinking of a node that acts like a switch with a slider toggle instead of requiring an input.

image

So when the switch is "off" it outputs whatever is in input_a, and when it's "on" it outputs input_b.
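Such a toggle could be sketched as a ComfyUI node that uses a boolean widget rather than a boolean input pin. The class name and the use of "*" as a wildcard type are assumptions common in custom-node code, not the eventual implementation:

```python
# Hypothetical sketch of an "Any Switch" toggle node. In ComfyUI custom
# nodes, "*" is a common convention for accepting any input type.

class AnyToggleSwitch:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                # a boolean widget rendered on the node itself,
                # not a boolean input pin
                "on": ("BOOLEAN", {"default": False}),
            },
            "optional": {
                "input_a": ("*",),
                "input_b": ("*",),
            },
        }

    RETURN_TYPES = ("*",)
    FUNCTION = "switch"
    CATEGORY = "utils"

    def switch(self, on, input_a=None, input_b=None):
        # off -> pass through input_a, on -> pass through input_b
        return (input_b if on else input_a,)
```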

alessandroperilli commented 9 months ago

> I've never seen a node that automatically reacts to state changes without queuing the generation, and I've seen hundreds. But if such a node exists and can be done, it would be amazing. Re the prompt selector, there are dozens of them. You can find them by searching for "prompt" on this page. I don't like any of them because even if you can see the list of prompt presets from dropdown menus, you never remember exactly what's in each preset. So I ended up creating a very large visual prompt builder for the AP Workflow. I prefer to see the full extent of the prompt I'll choose before choosing it.
>
> done ;)
>
> Now it reacts to user input and updates/displays the output and base model immediately instead of having to queue.

Remarkable. And thank you! If this can now read the selected LoRA model from an Efficient Node and a default LoRA Loader checkpoint, as we discussed, it becomes a must-have for many, many people.

alessandroperilli commented 9 months ago

> WRT the prompt selector, and given the above update (i.e. the ability to update UI elements based on user input), I can probably put together a prompt selector node that shows what's in the preset.
>
> I was also playing around building a workflow and started to feel the need for an "Any Switch". There are some switches in rgthree, but they are type-specific and require a boolean input. I was thinking of a node that acts like a switch with a slider toggle instead of requiring an input.
>
> image
>
> So when the switch is "off" it outputs whatever is in input_a, and when it's "on" it outputs input_b.

Rgthree has a node called Any Switch which defaults to the first non-null value. That's my preference over all the other switches I tried so far because it requires no manual reconfiguration.
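The first-non-null behavior described here is simple to model (a sketch, assuming unconnected inputs arrive as None):

```python
def any_switch(*inputs):
    """Return the first input that is not None, mimicking the
    first-non-null behavior described for rgthree's Any Switch."""
    for value in inputs:
        if value is not None:
            return value
    return None
```

Because it keys off None rather than a boolean pin, rewiring the workflow (connecting or disconnecting inputs) is the only "configuration" the switch needs.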

Also, in the Impact Pack, there's a Switch (Any), which allows for unlimited input pins and has an embedded selector (no need for a boolean value as input). That is my second-favorite switch.

Screenshot 2023-12-11 at 09 42 57

jitcoder commented 9 months ago

> WRT the prompt selector, and given the above update (i.e. the ability to update UI elements based on user input), I can probably put together a prompt selector node that shows what's in the preset. I was also playing around building a workflow and started to feel the need for an "Any Switch". There are some switches in rgthree, but they are type-specific and require a boolean input. I was thinking of a node that acts like a switch with a slider toggle instead of requiring an input.
>
> image
>
> So when the switch is "off" it outputs whatever is in input_a, and when it's "on" it outputs input_b.
>
> Rgthree has a node called Any Switch which defaults to the first non-null value. That's my preference over all the other switches I tried so far because it requires no manual reconfiguration.
>
> Also, in the Impact Pack, there's a Switch (Any), which allows for unlimited input pins and has an embedded selector (no need for a boolean value as input). That is my second-favorite switch.
>
> Screenshot 2023-12-11 at 09 42 57

Ah perfect, thanks!

I've been playing around with trying to get the lora_name output in a "non-hacky" way, with no luck. I think what I'll have to do is give the user an option:

  1. Select lora from dropdown
  2. Read loras from workspace

If the user selects option 2, then (because there can be multiple different LoRAs in multiple different nodes):

  1. Gather all lora_names.
  2. Fetch Civitai data.
  3. Show a selected_index option.
  4. Display the LoRA data for the LoRA at the selected index.
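Step 1 of option 2 (gathering every lora_name used in the workflow) might look roughly like this. The API-format prompt shape is an assumption based on ComfyUI's prompt JSON, and the helper name is hypothetical:

```python
def collect_lora_names(prompt: dict) -> list:
    """Collect unique lora_name values from a ComfyUI prompt dict.

    Assumes the API-format prompt shape:
    {node_id: {"class_type": ..., "inputs": {...}}}.
    """
    names = []
    for node in prompt.values():
        lora_name = node.get("inputs", {}).get("lora_name")
        if lora_name and lora_name not in names:
            names.append(lora_name)
    return names
```

The deduplicated list could then drive the selected_index widget, with Civitai data fetched per name.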

alessandroperilli commented 9 months ago

Before you try either implementation, I'd suggest you check the code behind this feature:

Screenshot 2023-12-11 at 10 39 58

You summon this popup by right-clicking on the stock Lora Loader node and selecting View Info:

Screenshot 2023-12-11 at 10 40 50

Your implementation is much more useful, IMO, but maybe that code could be useful in this case?

jitcoder commented 9 months ago

> Before you try either implementation, I'd suggest you check the code behind this feature:
>
> Screenshot 2023-12-11 at 10 39 58
>
> You summon this popup by right-clicking on the stock Lora Loader node and selecting View Info:
>
> Screenshot 2023-12-11 at 10 40 50
>
> Your implementation is much more useful, IMO, but maybe that code could be useful in this case?

Where is this node from? None of the loaders I have show that context-menu option.

image

alessandroperilli commented 9 months ago

It's not a property of any specific node, but a capability that applies to the stock ComfyUI LoraLoader node when you install the Custom Scripts suite.