timothybrooks / instruct-pix2pix


Cuda out of memory with 24GB VRAM #106

Open giuseppecartella opened 1 year ago

giuseppecartella commented 1 year ago

Hi!

I wrote a script that generates images from a list of PIL images, following edit_cli.py. However, after a few generated images I get a "CUDA out of memory" error; memory usage seems to keep increasing with every iteration. Any advice on how to fix, or at least limit, the memory consumption? (I am running the fine-tuned checkpoint from MagicBrush (https://github.com/OSU-NLP-Group/MagicBrush), so I cannot use the diffusers library, since that model has not been uploaded there yet.) I am running on a 24GB GPU.

Thank you!
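A likely cause of memory growing across iterations is that the forward passes run with autograd enabled, so every generated image keeps its computation graph (and all intermediate activations) alive on the GPU. edit_cli.py wraps its sampling in `torch.no_grad()` and autocast; a custom loop needs the same guard. Below is a minimal sketch of the pattern — the `Linear` model and the inputs are placeholders standing in for the actual instruct-pix2pix model and PIL images, not the repo's API:

```python
import torch

# Placeholder model: any nn.Module demonstrates the pattern.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = torch.nn.Linear(8, 8).to(device)
model.eval()

# Stand-ins for the list of PIL images.
images = [torch.randn(1, 8) for _ in range(4)]

outputs = []
with torch.no_grad():  # no autograd graph is built, so memory stays flat
    for x in images:
        y = model(x.to(device))
        outputs.append(y.cpu())  # move results off the GPU right away

# None of the outputs keeps a computation graph alive.
assert all(not o.requires_grad for o in outputs)
```

Moving each result to the CPU (or calling `.detach()`) before storing it also matters: appending GPU tensors that still reference the graph to a Python list is a classic source of this exact "memory keeps increasing" symptom.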

giuseppecartella commented 1 year ago

I am using the model in evaluation mode and have already called model.eval().cuda().
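Worth noting: `model.eval()` only switches layers such as dropout and batch norm to inference behaviour — it does not disable gradient tracking, which is what makes memory accumulate across iterations. A quick check showing the difference (generic `nn.Linear` used for illustration):

```python
import torch

model = torch.nn.Linear(4, 4)
model.eval()

# eval() alone: the output still carries a computation graph.
y = model(torch.randn(1, 4))
assert y.requires_grad

# no_grad() is what actually stops the graph (and its memory) from building up.
with torch.no_grad():
    y = model(torch.randn(1, 4))
assert not y.requires_grad
```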