Sygil-Dev / sygil-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[DOCUMENTATION] Add a guide for running the webserver with 4gb vram #191

Closed: pinilpy closed this 1 year ago

pinilpy commented 1 year ago

I did some tests on my RTX 3050 Ti Mobile and found that you can run Stable Diffusion with the web UI on 4 GB of VRAM, though just barely: when generating images at 512x512, I'm less than 20 MB from running out of memory.

It would be nice if you could add a small section to the README for people starved of VRAM. A few steps are needed to get it working:

  1. Use Linux (I haven't tested on Windows).
  2. Don't run a desktop environment (DE), or if you have integrated graphics, run your display off that instead.
  3. Launch scripts/webui.py directly, prefixing the command with PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 and appending --optimized (e.g. PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 python scripts/webui.py --optimized). You could also edit the relauncher to do the same.
  4. Visit the web server from another device on the network (run ip a on the host to find its address, then append a colon and the port number, e.g. 192.168.X.XX:7860).
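Steps 3 and 4 above can be sketched as a single shell session. The flag value and port are the ones mentioned in this issue, not officially documented defaults, and the address you see from `ip a` will of course differ:

```shell
# Low-VRAM launch sketch based on the steps above.
# Cap the CUDA caching allocator's block size to reduce fragmentation:
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128

# --optimized trades generation speed for lower VRAM use:
python scripts/webui.py --optimized

# In a second terminal, find the host's LAN address:
ip a
# Then browse to http://<that-address>:7860 from another device.
```

Exporting the variable once keeps the relauncher usable too, since child processes inherit the environment.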
hlky commented 1 year ago

I can add it to the wiki, if someone wants to have a go at it feel free, otherwise it's on my list of things to document so I'll get round to it

pinilpy commented 1 year ago

Thanks!

antler5 commented 1 year ago

I've got a GeForce GTX 1050 Ti (4 GB). I can run basujindal's repo, or its built-in Gradio UI, just fine, but even without the face-restoration and upscaling models in place, something about this repo's stack pushes the VRAM requirements from workable to not. With the tips in this issue I can just barely get the UI running, but I can't generate a single iteration of an image. I've spent some time exploring the Dockerfiles and .py files trying to figure out the difference between basujindal's txt2img_gradio.py and this repo's webui.py, but I can't tell where the extra resource use is coming from; it's just not my area of expertise. Can't hurt to mention it, though, and great work! I'm still glad this amazing repo is available.