AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: 1.4.0 lowram not working #11479

Open pakresi opened 1 year ago

pakresi commented 1 year ago

Is there an existing issue for this?

What happened?

I made a fresh install of 1.4.0. RAM: 7 GB, VRAM: 16 GB.

--lowram is not working; the checkpoint is still being loaded into RAM.

Maybe not related, but I also tried with the "Keep models in VRAM" setting checked.
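For context, here is a minimal sketch of how a `--lowram`-style switch is typically wired up: without it, checkpoint weights are forced through CPU RAM (`map_location="cpu"` for `torch.load`), and with it, the `map_location` override is dropped so tensors land on the device they were saved from. This is an assumption about the general pattern, not a quote of webui's actual code; the variable name below is hypothetical.

```python
# Sketch (assumed pattern, not webui's actual source): a --lowram-style
# flag decides whether torch.load is forced to stage weights in CPU RAM.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--lowram", action="store_true")
args = parser.parse_args(["--lowram"])

# "cpu" forces the weights through system RAM first; None would let
# torch.load restore tensors to the device recorded in the checkpoint.
weight_load_location = None if args.lowram else "cpu"
print(weight_load_location)  # → None
```

If the flag were being ignored, the effective value would stay `"cpu"` and the checkpoint would be staged in system RAM regardless, which matches the behavior reported above.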

Steps to reproduce the problem

  1. fresh install 1.4.0
  2. --xformers --opt-channelslast --lowram --listen --api
  3. It is impossible to start with 7 GB of RAM.

What should have happened?

It should load the model directly to the GPU (VRAM) instead of staging it in system RAM.

Version or Commit where the problem happens

v1.4.0

What Python version are you running on ?

Python 3.10.x

What platforms do you use to access the UI ?

Linux

What device are you running WebUI on?

Other GPUs

Cross attention optimization

xformers

What browsers do you use to access the UI ?

Google Chrome

Command Line Arguments

--xformers --opt-channelslast --lowram --listen --api

List of extensions

Standard (no extra extensions)

Console logs

################################################################
Install script for stable-diffusion + Web UI
Tested on Debian 11 (Bullseye)
################################################################

################################################################
Running on banias user
################################################################

################################################################
Repo already cloned, using it as install directory
################################################################

################################################################
python venv already activate: /home/banias/stable-diffusion-webui/venv
################################################################

################################################################
Launching launch.py...
################################################################
Using TCMalloc: libtcmalloc_minimal.so.4
Python 3.10.7 (main, May 29 2023, 13:51:48) [GCC 12.2.0]
Version: v1.4.0
Commit hash: 394ffa7b0a7fff3ec484bcd084e673a8b301ccc8
Installing requirements
Launching Web UI with arguments: --xformers --opt-channelslast --lowram --listen --api
Loading weights [6ce0161689] from /home/banias/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
preload_extensions_git_metadata for 7 extensions took 0.00s
Running on local URL:  http://0.0.0.0:7860

To create a public link, set `share=True` in `launch()`.
Startup time: 12.3s (import torch: 4.7s, import gradio: 2.3s, import ldm: 1.9s, other imports: 1.6s, load scripts: 0.5s, create ui: 0.9s, gradio launch: 0.2s, add APIs: 0.2s).
Creating model from config: /home/banias/stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.

-----------------------

banias@instance-1:~$ free -m
               total        used        free      shared  buff/cache   available
Mem:            7430        2056        1104          12        4268        5084
Swap:              0           0           0
banias@instance-1:~$ free -m
               total        used        free      shared  buff/cache   available
Mem:            7430        5249         150          12        2031        1891
Swap:              0           0           0
banias@instance-1:~$ free -m
               total        used        free      shared  buff/cache   available
Mem:            7430        6351         157          12         921         789
Swap:              0           0           0
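The `free -m` snapshots above can be reproduced with a one-liner that reads `/proc/meminfo` directly (the same source `free` uses); running it before and after the checkpoint load gives a rough estimate of how much system RAM the load consumed. A minimal sketch:

```shell
# Report available memory in MiB from /proc/meminfo; run before and after
# model load and compare the two values to estimate RAM used by the load.
avail_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
echo "available: $((avail_kb / 1024)) MiB"
```

Note that the `available` column already accounts for reclaimable `buff/cache`, so it is a better indicator than `free` when judging whether the ~5 GB drop shown above will actually cause an out-of-memory failure.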

Additional information

No response

missionfloyd commented 1 year ago

It runs fine with 8GB without --lowram, so I would imagine that 7GB (how? mismatched sticks? IGPU taking 1GB? Doesn't matter...) would be fine too.

akx commented 1 year ago

I'm not seeing how "its imposible to start with 7gb ram" from the output here – there's no error at all.