stavsap / comfyui-ollama


Model not found locally, downloading from HuggingFace... #9

Closed: kenic123 closed this issue 2 months ago

kenic123 commented 6 months ago

When running ComfyUI, I get "Model not found locally, downloading from HuggingFace...". It occupies VRAM for a long time and drags ComfyUI down to 50 s/it.

dicksondickson commented 5 months ago

Are you using custom nodes that are downloading models from Hugging Face?

Amit30swgoh commented 4 months ago

Model not found! You didn't say where to put the models, and you also didn't give a download link. Why?

stavsap commented 4 months ago

What models? You need to have the Ollama server running; Ollama manages the models, etc.

You can pull and work with any model you like.

Install Ollama: https://ollama.com/download

Then pull a model from: https://ollama.com/library

In the nodes you will be able to select the models you have in Ollama.
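
A quick way to confirm the server is reachable, and to see exactly which models the nodes will offer, is to query Ollama's REST API (GET /api/tags, which lists the locally pulled models). A minimal Python sketch, assuming a default local install on port 11434:

```python
import json
import urllib.request

# Default Ollama endpoint; adjust the host/port if your server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/tags"

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        data = json.load(resp)
except OSError as exc:
    raise SystemExit(f"Ollama server not reachable at {OLLAMA_URL}: {exc}")

# /api/tags returns {"models": [{"name": "llama3:latest", ...}, ...]}
names = [m["name"] for m in data.get("models", [])]
print("Models available to the ComfyUI nodes:", names or "none; run `ollama pull <model>` first")
```

If this prints an empty list, pull something first (for example `ollama pull llama3`, where llama3 is just an example model name), then refresh the node.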

smoran commented 3 months ago

> Model not found! You didn't say where to put the models, and you also didn't give a download link. Why?

Just make sure you run the Ollama server prior to opening ComfyUI. If it is already running, simply re-enter the IP address of the Ollama server and it will refresh the models list.
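
If you want to sanity-check the connection from Python before opening ComfyUI, here is a sketch using the official `ollama` client package (`pip install ollama`); the host below is a placeholder for whatever address you enter in the node:

```python
from ollama import Client

# Placeholder address: use the same host/port you type into the ComfyUI node.
client = Client(host="http://127.0.0.1:11434")

# list() asks the server for its pulled models; this is the same list the
# node shows after you re-enter the server IP and it refreshes.
print(client.list())
```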