deep-floyd / IF


How to download the model for local use? #126

Open · 404-xianjin opened this issue 1 year ago

404-xianjin commented 1 year ago

I downloaded the three models locally and modified the script file (see screenshot), but I get the error shown in the attached screenshot.

My question is: can't the model be loaded locally? Do I have to log in to Hugging Face every time I run the script? My network speed is very slow, and if I have to download the models on every run it will take a long time. Any help is much appreciated.

@shonenkov @zeroshot-ai @apolinario @Gugutse @ivksu @williamberman @sayakpaul @estability
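A minimal sketch of what loading the three IF stages from local folders can look like with diffusers. The `/path/to/...` directories are placeholders for wherever the repositories were downloaded, and `variant="fp16"` assumes the fp16 weight files are present in those folders:

```python
# Sketch: load the three IF stages from local directories instead of the Hub.
# The paths are placeholders; each folder must contain model_index.json plus
# the model weights. Drop variant="fp16" if only full-precision weights exist.
import torch
from diffusers import DiffusionPipeline

stage_1 = DiffusionPipeline.from_pretrained(
    "/path/to/IF-I-XL-v1.0",                  # local copy of DeepFloyd/IF-I-XL-v1.0
    variant="fp16",
    torch_dtype=torch.float16,
)
stage_2 = DiffusionPipeline.from_pretrained(
    "/path/to/IF-II-L-v1.0",                  # local copy of DeepFloyd/IF-II-L-v1.0
    text_encoder=None,                         # stage 2 reuses prompt embeds from stage 1
    variant="fp16",
    torch_dtype=torch.float16,
)
stage_3 = DiffusionPipeline.from_pretrained(
    "/path/to/stable-diffusion-x4-upscaler",  # upscaler used as stage 3
    torch_dtype=torch.float16,
)
```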

sayakpaul commented 1 year ago

> My question is: can't the model be loaded locally? Do I have to log in to Hugging Face every time I run the script? My network speed is very slow, and if I have to download the models on every run it will take a long time. Any help is much appreciated.

The checkpoints used by this library are downloaded with the `hf_hub_download()` utility from the `huggingface_hub` library. Files downloaded this way are cached automatically, so subsequent runs should not re-download them.
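A sketch of how to lean on that cache, assuming the default cache location under `~/.cache/huggingface`. The `local_files_only` flag (and the `HF_HUB_OFFLINE` environment variable) forces the libraries to resolve files from the cache instead of contacting the Hub:

```python
# Sketch: after the first (online) run has populated the Hub cache,
# later runs can be forced to reuse it and never touch the network.
# Alternatively: export HF_HUB_OFFLINE=1 before running the script.
import torch
from diffusers import DiffusionPipeline

# First run (online): downloads into the Hub cache.
stage_1 = DiffusionPipeline.from_pretrained(
    "DeepFloyd/IF-I-XL-v1.0", variant="fp16", torch_dtype=torch.float16
)

# Later runs: resolve everything from the local cache only.
stage_1 = DiffusionPipeline.from_pretrained(
    "DeepFloyd/IF-I-XL-v1.0",
    variant="fp16",
    torch_dtype=torch.float16,
    local_files_only=True,
)
```

As far as I know, the IF repositories are gated, so the first download requires accepting the license on the Hub and running `huggingface-cli login` once; the token is then stored locally, so you should not need to log in on every run.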

404-xianjin commented 1 year ago

Wow, that is indeed the case: when I run the script repeatedly, the model files are loaded from the cache. But I ran into another error. My graphics card is an NVIDIA A5000 (see screenshot). Stage 3 does not run correctly and fails with the error shown in the attached screenshots. Thank you very much.

Jon-Zbw commented 1 year ago

I think you should clone the entire model repository, e.g. `git clone https://huggingface.co/DeepFloyd/IF-I-XL-v1.0`, without setting `GIT_LFS_SKIP_SMUDGE=1`, so that the LFS weight files are actually downloaded. PS: I haven't tested this.
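If cloning through git LFS is inconvenient, a possible alternative (not from this thread, just the standard `huggingface_hub` API) is `snapshot_download`, which pulls every file in a repository into a local directory that `from_pretrained` can then point at. The `local_dir` path here is a placeholder:

```python
# Sketch: download a full model repository without git, then load it locally.
# For the gated IF repos you must have accepted the license and be logged in
# (or pass a token; the exact parameter may vary by huggingface_hub version).
import torch
from huggingface_hub import snapshot_download
from diffusers import DiffusionPipeline

local_dir = snapshot_download(
    repo_id="DeepFloyd/IF-I-XL-v1.0",
    local_dir="./IF-I-XL-v1.0",   # where the full repository ends up on disk
)

stage_1 = DiffusionPipeline.from_pretrained(
    local_dir, variant="fp16", torch_dtype=torch.float16
)
```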