sddebz / stable-diffusion-krita-plugin

GNU Affero General Public License v3.0

./webui.sh uses about 2.5GB vRAM. Is that ok? #39

Closed: YAFU closed this issue 2 years ago

YAFU commented 2 years ago

Hi. I'm using Linux. I have successfully installed the Krita plugin and all its dependencies with ./webui.sh. But when I start the server by running ./webui.sh in a terminal, it takes 2.5GB of vRAM as soon as it initializes, without Krita even being open yet. My GPU has 4GB of vRAM, so I then get CUDA out of memory even with 200x200px images.

Is it correct that the server uses so much vRAM just by initializing itself?

When I run ./webui.sh, the output ends with:

LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
Loading weights [7460a6fa] from /media/DISK1/@@stable-diffusion-krita-plugin/stable-diffusion-krita-plugin/models/model.ckpt
Global Step: 470000
Model loaded.
INFO: Started server process [1360360]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Running on local URL: http://127.0.0.1:7860

To create a public link, set share=True in launch().

According to nvidia-smi, python3 is using 2469MiB.

sddebz commented 2 years ago

Do you use any low-vRAM options like --lowvram or --medvram? Without them it should not be possible to run the model on 4GB; with default options I was only able to run at 512x640 on my 8GB card.

Please look at the guides from the parent repository:
https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Features#4gb-videocard-support
https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Optimizations
https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Run-with-Custom-Parameters

webui.bat and webui.sh from this repo should accept all the options that AUTOMATIC1111's versions do.
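For reference, a minimal sketch of passing one of these options on the command line (flag behavior as described in the AUTOMATIC1111 wiki pages above; use one flag or the other, not both):

```shell
# --lowvram keeps most model modules in system RAM and moves them to
# VRAM only while they are being used (slowest, smallest footprint).
./webui.sh --lowvram

# --medvram is a milder speed/memory trade-off for cards with ~4-6GB.
./webui.sh --medvram
```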

YAFU commented 2 years ago

Thank you very much, that did the trick. I edited the "webui-user.sh" file to add the line:

export COMMANDLINE_ARGS="--lowvram"

And now the python3 process uses only about 400MiB of vRAM when starting the server.
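For anyone else checking the effect of the flag, per-process VRAM usage can be queried directly (this assumes an NVIDIA GPU with working drivers):

```shell
# List each GPU compute process with its PID, name, and VRAM usage.
nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv
```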