Jack000 / glid-3-xl-stable

stable diffusion training
MIT License
290 stars 36 forks

Suggestion - Using Hivemind for distributed training #1

Closed — chavinlo closed this issue 1 year ago

chavinlo commented 2 years ago

Would it be possible to use Hivemind (https://github.com/learning-at-home/hivemind) to distribute the compute? Also, is there a way to lower the VRAM usage?

Jack000 commented 1 year ago

Yeah, that could be hard; I'm not too familiar with Hivemind.

You could lower VRAM usage by pre-encoding the images and disabling EMA, but even with those savings it will only barely fit on a 24GB card with batch size 1 at 256x256. DeepSpeed CPU offloading might also work, but I'm not sure how much it would save.
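For the DeepSpeed route, CPU offloading is driven by a ZeRO JSON config. A minimal stage-2 fragment that moves optimizer state to CPU might look like this (field names are DeepSpeed's; the batch size value is illustrative):

```json
{
  "train_micro_batch_size_per_gpu": 1,
  "zero_optimization": {
    "stage": 2,
    "offload_optimizer": {
      "device": "cpu",
      "pin_memory": true
    }
  }
}
```

Offloading optimizer state mainly saves the Adam moments (roughly 8 bytes per parameter in fp32), at the cost of extra PCIe traffic per step; activations still live on the GPU, which is why the savings here are hard to predict without trying it.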
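The pre-encoding idea is to run each image through the frozen VAE encoder once, save the small latents to disk, and then train only on latents, so neither the encoder nor the full-size images occupy VRAM during training. A minimal sketch of that caching pattern (the tiny `nn.Conv2d` here is only a stand-in for the real Stable Diffusion VAE encoder, which also downsamples by 8x; `LATENT_DIR` is an illustrative name, not from this repo):

```python
# Sketch: cache latents once so the image encoder never sits in VRAM
# while the diffusion model trains. The "encoder" below is a stand-in
# for the frozen SD VAE: 3x256x256 image -> 4x32x32 latent (8x downsample).
import torch
import torch.nn as nn
from pathlib import Path

LATENT_DIR = Path("latents")  # hypothetical cache directory
LATENT_DIR.mkdir(exist_ok=True)

encoder = nn.Conv2d(3, 4, kernel_size=8, stride=8)  # toy VAE-encoder stand-in

@torch.no_grad()
def cache_latents(images):
    """Encode each image exactly once and save the latent to disk."""
    for i, img in enumerate(images):
        latent = encoder(img.unsqueeze(0)).squeeze(0)
        torch.save(latent, LATENT_DIR / f"{i}.pt")

# Fake "dataset" of two images; a real training loop would then load
# only these latents and skip the VAE entirely.
images = [torch.randn(3, 256, 256) for _ in range(2)]
cache_latents(images)

latent = torch.load(LATENT_DIR / "0.pt")
print(tuple(latent.shape))  # (4, 32, 32)
```

Since the latents are 8x smaller per side (and 4 channels instead of 3), the cached dataset is also far cheaper to load than the original images.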