Closed — qrrbszopgu closed this issue 1 month ago
I will think about it. Maybe the model input could be made optional, but it is needed for the model variables, so that would reduce functionality.
Thank you. When I generate using an API, it would be nice to save the RAM.
Just released a new version. Let me know if this works for you. You can connect a string node containing the model's class name to the model input, so the system variables keep working. Check the env_info variable in ppp_comfyui.py to see which values to use.
Note that you will have to recreate the node, or weird things happen with the inputs.
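As a rough illustration of the workaround described above, here is a minimal sketch of how a node could accept either a loaded model or a plain class-name string. All names here (`resolve_model_class`, the `"SDXL"` value) are hypothetical and not the actual ppp_comfyui.py API; the real accepted values are listed in its env_info variable.

```python
# Hypothetical sketch: let a node resolve model-dependent variables from a
# class-name string instead of a loaded model object. Illustrative only;
# not the actual ppp_comfyui.py implementation.
def resolve_model_class(model_or_name):
    """Accept either a model object or its class name as a string."""
    if isinstance(model_or_name, str):
        # e.g. "SDXL" coming from a string node instead of a loaded model
        return model_or_name
    return type(model_or_name).__name__

print(resolve_model_class("SDXL"))  # → SDXL
```

This way the string node replaces the model input without loading any weights into RAM.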
It would be nice if there were a second node that doesn't need a model input, maybe taking just the model name as text?
I want to use it in workflows where I only need processing of the positive/negative prompt text, without generating locally. I know I can attach a dummy model, but I would rather not keep loading it on every run.