Closed mooonwalker1983 closed 10 months ago
Also test with --lowvram (not --lowram). I'm fairly sure it can work on your machine. Could it be a driver issue of some sort?
I've got the latest Nvidia drivers, but you're right, I can't see any reason why this wouldn't work. It works fine for non-SDXL models, but anything SDXL-based fails to load :/
The underlying problem was in the swap file (pagefile) settings. It works when the pagefile is set to automatic mode on Windows.
Thanks so much, it works.
1.6.0-RC should have resolved this. Also see the wiki for more information: https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Optimum-SDXL-Usage
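For anyone hitting the same out-of-memory failure, low-VRAM behavior can be enabled via launch arguments in webui-user.bat. A minimal sketch, assuming a standard Windows install (the exact flags available, such as --medvram or --lowvram, depend on your webui version, so check the wiki page linked above):

```bat
rem webui-user.bat — minimal sketch for an 8 GB GPU
rem --medvram trades speed for lower VRAM usage; use --lowvram if it still fails
set COMMANDLINE_ARGS=--medvram
call webui.bat
```

Start with --medvram and only fall back to --lowvram if loading the SDXL checkpoint still runs out of memory, since --lowvram is noticeably slower.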
The minimum graphics memory requirement for SDXL 1.0 is 12GB+
Let's help stop this misinformation floating around the internet 👌🙂
SDXL 1.0 doesn't need 12GB+
SDXL 1.0 works just fine with 8GB of VRAM. But if you're using it with the AUTOMATIC1111 UI, then yeah, you'll need 12GB+
The wiki already explains that; even 4GB works.
Is there an existing issue for this?
What happened?
I have installed and updated AUTOMATIC1111 and put the SDXL model in the models folder, but it doesn't work: it tries to load the model but fails. It works in ComfyUI. Hardware: RTX 4060 Ti 8 GB, 32 GB RAM, Ryzen 5 5600.
Steps to reproduce the problem
I don't know.
What should have happened?
The model should load without errors.
Version or Commit where the problem happens
1.5.0
What Python version are you running on ?
Python 3.10.x
What platforms do you use to access the UI ?
Windows
What device are you running WebUI on?
Nvidia GPUs (RTX 20 above)
Cross attention optimization
Automatic
What browsers do you use to access the UI ?
Google Chrome
Command Line Arguments
List of extensions
Console logs
Additional information
No response