deep-floyd / IF


CUDA out of memory. #99

Open CORCTON opened 1 year ago

CORCTON commented 1 year ago

I ran the code below, which loads the model into VRAM, so when I run the rest of the notebook I get "CUDA out of memory." How can I undo this and remove the model from VRAM?

(screenshot of the code attached)
kanttouchthis commented 1 year ago

By changing `device = "cuda:0"` to `device = "cpu"` you can run the code on the CPU without using any VRAM, but that will be extremely slow. I recommend using the example code in the README.md with diffusers instead, as CPU model offload significantly reduces VRAM requirements. If that isn't enough, you can try some of the options here to reduce VRAM usage further. If you need a pre-made notebook, you can find one on Colab, though it might need modification for lower-VRAM instances.

phalexo commented 1 year ago

Approximate VRAM requirements:

- T5: about 11.6 GiB
- `if_I`: about 9.2 GiB
- `if_II` + `if_III`: about 5.8 GiB together, roughly 3 GiB each separately