stavsap / comfyui-ollama

Apache License 2.0

Model not found locally, downloading from HuggingFace... #9

Open kenic123 opened 1 month ago

kenic123 commented 1 month ago

When running ComfyUI, the message "Model not found locally, downloading from HuggingFace..." appears. It occupies VRAM for a long time and slows ComfyUI down to 50 s/it.

dicksondickson commented 1 month ago

Are you using custom nodes that are downloading models from Hugging Face?

Amit30swgoh commented 2 days ago

Model not found! You did not say where to put the models, and you also did not give a link to download them. Why?

stavsap commented 1 day ago

What models? You need to have an Ollama server running; Ollama manages the models, etc.

You can pull and work with any model you like.

Install Ollama: https://ollama.com/download

and pull models from https://ollama.com/library

In the nodes you will be able to select the models you have in Ollama.
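If it helps, here is a minimal Python sketch (assuming a default local Ollama server at http://localhost:11434 and the documented /api/tags and /api/pull REST endpoints) that lists the models Ollama already has, which is what the nodes will show, and optionally pulls one by name. The model name used is only an example; `ollama pull <name>` from the command line does the same thing.

```python
# Minimal sketch: talk to a local Ollama server to see / pull models.
# Assumes the default Ollama address http://localhost:11434.
import json
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama server address


def list_local_models():
    """Return the names of models already pulled into Ollama (what the nodes can select)."""
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]


def pull_model(name: str):
    """Pull a model from https://ollama.com/library; Ollama streams progress as JSON lines."""
    with requests.post(f"{OLLAMA_URL}/api/pull", json={"model": name}, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if line:
                print(json.loads(line).get("status", ""))


if __name__ == "__main__":
    print("Models available to the nodes:", list_local_models())
    # pull_model("llama3.2")  # example model name, pick any from the library
```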