Closed kenic123 closed 2 months ago
Are you using custom nodes that are downloading models from Hugging Face?
Model not found! You didn't say where to put the models, and you also didn't give a download link. Why?
What models? You need to have the Ollama server running; Ollama manages the models itself.
You can pull and work with any model you like:
install Ollama: https://ollama.com/download
and pull the models you want
In the nodes you will then be able to select the models you have in Ollama.
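To confirm the setup described above before opening ComfyUI, you can ask the Ollama server which models it has. This is a minimal sketch assuming the default Ollama address (`http://127.0.0.1:11434`) and its standard `/api/tags` endpoint; adjust the URL if your server runs elsewhere.

```python
import json
import urllib.request
import urllib.error

# Default Ollama server address; change this if yours differs.
OLLAMA_URL = "http://127.0.0.1:11434"

def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body of Ollama's /api/tags response."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the models the running Ollama server knows about, or [] if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            return parse_model_names(resp.read().decode())
    except (urllib.error.URLError, OSError):
        # Server not running: start it first (e.g. `ollama serve`), then pull a model.
        return []

if __name__ == "__main__":
    models = list_ollama_models()
    if models:
        print("Available Ollama models:", ", ".join(models))
    else:
        print("No Ollama server reachable; run it and `ollama pull` a model first")
```

If this prints an empty list, the ComfyUI nodes will have nothing to select, which is exactly the "model not found" situation described in this thread.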
Just make sure you run the Ollama server prior to opening ComfyUI. If ComfyUI is already running, simply re-enter the IP address of the Ollama server and it will refresh the models list.
"Model not found locally, downloading from HuggingFace..." While ComfyUI is running, this occupies VRAM for a long time and slows ComfyUI down to 50 s/it.