huchenlei / ComfyUI_omost

ComfyUI implementation of Omost
Apache License 2.0

I would like to use a locally installed (offline) model #32

Open smae08 opened 3 weeks ago

smae08 commented 3 weeks ago

We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like lllyasviel/omost-dolphin-2.9-llama3-8b-4bits is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

I downloaded the model the first time I ran the node, but after restarting the computer it had to download everything again, which wastes a lot of time. It would also be great if inference speed could be improved, for example by using a local Ollama.
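A minimal sketch of avoiding the re-download by keeping the weights in a persistent folder rather than the default cache, assuming the standard huggingface_hub/transformers APIs (snapshot_download, local_files_only); the target folder name is only an example, and loading this 4-bit checkpoint typically also requires bitsandbytes to be installed:

```python
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download the weights once into a persistent folder (path is illustrative).
local_dir = snapshot_download(
    repo_id="lllyasviel/omost-dolphin-2.9-llama3-8b-4bits",
    local_dir="./models/omost-dolphin-2.9-llama3-8b-4bits",
)

# Later runs load straight from disk; no connection to huggingface.co is needed.
tokenizer = AutoTokenizer.from_pretrained(local_dir, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(local_dir, local_files_only=True)
```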

didikee commented 2 days ago

Like you, I had already downloaded the model once for the standalone Omost project. In this project you shouldn't need to download it again; just place the previously downloaded model in the corresponding location. Where is that location? In the Omost project it is project_root/hf_download/hub/
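If the files already sit under that Omost directory, one option (a sketch, not something this extension does automatically; HF_HOME and HF_HUB_OFFLINE are standard Hugging Face environment variables, and the path below is a placeholder for your actual checkout) is to point the Hugging Face cache at it before transformers is imported, either by exporting the variables in the shell that launches ComfyUI or by setting them early in Python:

```python
import os

# HF_HOME should be the parent of the "hub" folder mentioned above,
# and must be set before transformers / huggingface_hub are imported.
os.environ["HF_HOME"] = "/path/to/Omost/hf_download"
os.environ["HF_HUB_OFFLINE"] = "1"  # optional: never try to reach huggingface.co

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "lllyasviel/omost-dolphin-2.9-llama3-8b-4bits",
    local_files_only=True,  # error out instead of attempting a download
)
```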