Open pakresi opened 1 year ago
It runs fine with 8 GB without `--lowram`, so I would imagine that 7 GB (how? mismatched sticks? iGPU taking 1 GB? Doesn't matter...) would be fine too.
I'm not seeing how "it's impossible to start with 7 GB RAM" follows from the output here – there's no error at all.
Is there an existing issue for this?
What happened?
I made a fresh install of 1.4.0. RAM: 7 GB, VRAM: 16 GB.
`--lowram` is not working; it still loads the checkpoint into RAM.
Maybe not related, but I also tried checking "Keep models in VRAM" in the settings.
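For context, the intent of `--lowram` is to materialize checkpoint weights directly in VRAM instead of staging them in system RAM. A minimal sketch of that idea, assuming the usual PyTorch `map_location` mechanism; the helper name is illustrative and not the webui's actual code:

```python
def checkpoint_map_location(lowram: bool) -> str:
    """Pick where checkpoint tensors should be materialized on load.

    With --lowram the weights go straight to the GPU ("cuda"),
    skipping the usual stop in system RAM ("cpu").
    (Illustrative helper, not the webui's actual implementation.)
    """
    return "cuda" if lowram else "cpu"


# In real code this value would be passed to torch.load, roughly:
#   state_dict = torch.load(path, map_location=checkpoint_map_location(lowram))
print(checkpoint_map_location(True))   # cuda
print(checkpoint_map_location(False))  # cpu
```

If the flag is being picked up, peak system-RAM use during model load should drop noticeably; the report above suggests that is not happening here.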
Steps to reproduce the problem
What should have happened?
It should load the models into GPU VRAM instead of system RAM.
Version or Commit where the problem happens
v1.4.0
What Python version are you running on ?
Python 3.10.x
What platforms do you use to access the UI ?
Linux
What device are you running WebUI on?
Other GPUs
Cross attention optimization
xformers
What browsers do you use to access the UI ?
Google Chrome
Command Line Arguments
List of extensions
Standard (default) extensions only.
Console logs
Additional information
No response