hollowstrawberry / kohya-colab

Accessible Google Colab notebooks for Stable Diffusion Lora training, based on the work of kohya-ss and Linaqruf
GNU General Public License v3.0
599 stars 86 forks

How is it possible for your Colab to run Prodigy with only 16 GB of VRAM? #160

Closed githubbb5ty closed 1 month ago

githubbb5ty commented 4 months ago

Could you tell me which configuration allows you to run Prodigy with only 16 GB of VRAM?

I rented a GPU on Vast.ai, and with Prodigy it uses 42 GB of VRAM.

Which arguments should I change to reduce VRAM usage like in your Colab?

hollowstrawberry commented 4 months ago

Gradient checkpointing and caching latents to disk are the two most important settings.
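For reference, here is a sketch of how those two settings map onto kohya-ss sd-scripts command-line flags when training with the Prodigy optimizer. The script name, paths, and surrounding values are placeholders for illustration, not the exact configuration used by the Colab:

```shell
# Hypothetical kohya-ss sd-scripts invocation (paths/values are placeholders).
accelerate launch train_network.py \
  --pretrained_model_name_or_path="/path/to/model.safetensors" \
  --optimizer_type="Prodigy" \
  --learning_rate=1.0 \
  --gradient_checkpointing \
  --cache_latents \
  --cache_latents_to_disk \
  --mixed_precision="fp16"
```

`--gradient_checkpointing` trades compute for memory by recomputing activations during the backward pass instead of storing them, and `--cache_latents_to_disk` precomputes the VAE latents once and keeps them out of GPU memory during training. Together these are what let the same optimizer fit in a much smaller VRAM budget.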