lks-ai / anynode

A Node for ComfyUI that does what you ask it to do

[Feature Request] Allow the loading of a local llm/T5 model within the node, instead of accessing another server. #18

Open · GalaxyTimeMachine opened this issue 5 months ago

GalaxyTimeMachine commented 5 months ago

I have several LLM and T5 models already downloaded that can be used within a workflow to enhance prompts. Would it be possible to select one of these models from the dropdown options of the "AnyNode"?

For example, using a loader like the T5 loader node (screenshot of the loader node omitted).
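For readers unfamiliar with the request, the in-process loading being asked for might look roughly like the sketch below, assuming the Hugging Face transformers library and a locally downloaded T5 checkpoint. The path, model class, and `enhance_prompt` helper are placeholders for illustration, not anything AnyNode currently exposes.

```python
# Hypothetical sketch: load a locally downloaded T5 model in-process
# and call it once per task for prompt enhancement.
from transformers import T5Tokenizer, T5ForConditionalGeneration

MODEL_PATH = "/path/to/local/t5-checkpoint"  # placeholder path, adjust to your setup

tokenizer = T5Tokenizer.from_pretrained(MODEL_PATH)
model = T5ForConditionalGeneration.from_pretrained(MODEL_PATH)

def enhance_prompt(prompt: str) -> str:
    """Run the local model once and return the generated text."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(enhance_prompt("Expand this prompt: a castle at sunset"))
```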

lks-ai commented 5 months ago

If you can run it on a server, you can... I mean, I have no idea and have never used that, man. Seems out of scope.
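For anyone finding this later: the workaround being alluded to is to serve the local model behind an OpenAI-compatible endpoint and point AnyNode at that. A minimal sketch of that pattern, assuming a local server such as Ollama on its default port; the base URL, model name, and prompt are assumptions about a local setup, not AnyNode configuration.

```python
# Hypothetical sketch: query a locally hosted model through an
# OpenAI-compatible endpoint (e.g. an Ollama or llama.cpp server).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local endpoint
    api_key="not-needed",                  # local servers typically ignore this
)

response = client.chat.completions.create(
    model="llama3",  # placeholder: whatever model your local server exposes
    messages=[{"role": "user", "content": "Enhance this prompt: a castle at sunset"}],
)
print(response.choices[0].message.content)
```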

GalaxyTimeMachine commented 5 months ago

It just runs within ComfyUI and is called once per task; it isn't something that runs as a server. This seems like the most efficient way to use AnyNode.