lks-ai / anynode

A Node for ComfyUI that does what you ask it to do

Local .gguf model support #26

Closed: MilitantHitchhiker closed this issue 2 weeks ago

MilitantHitchhiker commented 3 weeks ago

Wondering if it's possible to add direct support for local .gguf LLM models.

lks-ai commented 2 weeks ago

If you can load a GGUF model in ollama or vLLM, then definitely. AnyNode doesn't load models itself; it calls the API endpoint of your local LLM host, such as ollama, LM Studio, or vLLM. If the host supports GGUF, you can use a GGUF model in AnyNode by pointing the node at that host's endpoint.
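
As a rough illustration (not AnyNode's own code), here is a minimal sketch of checking that a GGUF model served locally by ollama is reachable through its OpenAI-compatible endpoint, which is the kind of endpoint AnyNode would be pointed at. The model name `my-gguf-model` is a placeholder; it assumes you created it yourself, e.g. with `ollama create my-gguf-model -f Modelfile` where the Modelfile contains `FROM ./path/to/model.gguf`.

```python
# Sketch: verify a locally served GGUF model responds on ollama's
# OpenAI-compatible API before pointing AnyNode at the same endpoint.
# Assumes ollama is running on the default port 11434 and that a model
# named "my-gguf-model" (hypothetical name) was created from a .gguf file.
import requests

resp = requests.post(
    "http://localhost:11434/v1/chat/completions",  # ollama's OpenAI-compatible route
    json={
        "model": "my-gguf-model",
        "messages": [{"role": "user", "content": "Say hello in one word."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If that request returns a completion, the same base URL and model name should work when configured in AnyNode as the local LLM endpoint.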