illtellyoulater closed this issue 1 year ago
Sorry, my friend.
The difference in VRAM requirements comes from the Chinese SDWEBUI community, where many modified SDWEBUI integration packages such as AKI have optimized VRAM usage quite effectively. With those optimizations, combined with Windows' VRAM swapping mechanism, it is possible to run most features with 12 GB of VRAM; I have seen community reports of 8 GB (and even 6 GB, though very slow) Windows systems running most features successfully. On Linux, however, the original SDWEBUI settings can cause usage to spike to around 13 GB when running the main features. If you set up --xformers correctly, you should be able to run most SD 1.5-based features with 12 GB.
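For anyone hitting the Linux VRAM spike, here is a minimal sketch of how the flag could be passed. This assumes the stock AUTOMATIC1111 stable-diffusion-webui launcher (webui-user.sh and COMMANDLINE_ARGS are that launcher's conventions, not something specific to this project), so adjust for whichever fork you actually run:

```bash
# webui-user.sh (Linux) -- sketch, assuming the upstream AUTOMATIC1111 launcher.
# --xformers enables memory-efficient attention, as described above.
# --medvram is the upstream low-VRAM option and may also help on 12 GB cards,
# at the cost of some speed.
export COMMANDLINE_ARGS="--xformers --medvram"
```

On the next ./webui.sh launch the arguments are picked up automatically; whether 12 GB is ultimately enough will still depend on which features you run.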
If you run into any problems, we would be glad to help you solve them, and we would be pleased to receive your PR to improve the usage experience!
From the README:
Requirements for Windows 10: Nvidia 3060 12 GB
Requirements for Linux: Nvidia A10 24 GB & Nvidia V100 16 GB & Nvidia A100 40 GB
How can they be so different? Perhaps the authors meant that the above is the hardware they were able to test the software on, and not strictly the requirements?
Why, with ChatGPT and all these crazy things we are seeing, are people still being tripped up by language barriers?