Closed missscott closed 3 months ago
Please help: why can I run Llama 3.1 locally with Ollama, but ComfyUI gives an error when I enter the model name?
The llama3.1 model is a text-only LLM; it doesn't support vision (image input), but you are using a vision node.
For vision you can use a multimodal model such as llava: https://ollama.com/library/llava
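For reference, here is a minimal sketch of what a vision request to Ollama's `/api/generate` endpoint looks like: vision models such as llava take base64-encoded images in the `images` field, which text-only models like llama3.1 cannot handle. The helper name and the fake image bytes below are illustrative, not part of any library.

```python
import base64
import json

def build_vision_request(model: str, prompt: str, image_bytes: bytes) -> str:
    """Build the JSON body for a POST to Ollama's /api/generate endpoint.

    Vision models (e.g. llava) read base64 images from the "images" field;
    a text-only model like llama3.1 has no image encoder, which is why a
    vision node fails with it.
    """
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    })

# Illustrative only: real use would read actual image file bytes.
body = build_vision_request("llava", "Describe this image.", b"\x89PNG fake bytes")
print(json.loads(body)["model"])  # llava
```

Swapping the model name to `llava` (after `ollama pull llava`) in the ComfyUI vision node should resolve the error.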