Closed by MilitantHitchhiker 2 weeks ago
If you can load GGUF on ollama or vLLM, then definitely. AnyNode doesn't load models itself; it calls the API endpoint of your local LLM host, such as ollama, LM Studio, or vLLM. If the host supports GGUF, you can use a GGUF model in AnyNode by pointing it at that host's endpoint.
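To make the "pointing it correctly" part concrete: ollama exposes an OpenAI-compatible endpoint on its default port 11434, so the request AnyNode-style tooling sends is an ordinary JSON POST. A minimal sketch with the Python standard library, assuming ollama's default address and a placeholder model name `my-gguf-model` (in ollama you could create such a model from a local file with `ollama create my-gguf-model -f Modelfile`, where the Modelfile contains `FROM ./model.gguf`):

```python
import json
import urllib.request

# ollama's OpenAI-compatible chat endpoint (default port 11434).
base_url = "http://localhost:11434/v1/chat/completions"

# "my-gguf-model" is a placeholder -- use whatever model name you
# pulled or created in ollama from your local .gguf file.
payload = {
    "model": "my-gguf-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Build the request without sending it; supplying data makes it a POST.
req = urllib.request.Request(
    base_url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

print(req.full_url)      # http://localhost:11434/v1/chat/completions
print(req.get_method())  # POST

# To actually send it (requires ollama running locally):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same request shape works against LM Studio and vLLM, since both also serve an OpenAI-compatible API; only the base URL and port change.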
Wondering if it's possible to add direct support for local .gguf LLM models.