AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI

[Bug]: Xformers available but not #5898

Open mmann1123 opened 1 year ago

mmann1123 commented 1 year ago

Is there an existing issue for this?

What happened?

xformers is installed and available in my conda env, yet the web UI reports it as unavailable.

To demonstrate that xformers is working: python -m xformers.info

xFormers 0.0.15+ea1048b.d20221220
memory_efficient_attention.cutlassF:               available
memory_efficient_attention.cutlassB:               available
memory_efficient_attention.flshattF:               available
memory_efficient_attention.flshattB:               available
memory_efficient_attention.smallkF:                available
memory_efficient_attention.smallkB:                available
memory_efficient_attention.tritonflashattF:        available
memory_efficient_attention.tritonflashattB:        available
swiglu.fused.p.cpp:                                available
is_triton_available:                               True
is_functorch_available:                            False
pytorch.version:                                   1.13.0+cu117
pytorch.cuda:                                      available
gpu.compute_capability:                            8.6
gpu.name:                                          NVIDIA GeForce RTX 3080 Ti

But when running

bash <(wget -qO- https://raw.githubusercontent.com/AUTOMATIC1111/stable-diffusion-webui/master/webui.sh)

I'm getting:

Dreambooth revision: c678a431b5d79ba7e1c38d7629ca7e0c79166a1f
SD-WebUI revision: 685f9631b56ff8bd43bce24ff5ce0f9a0e9af490

Checking Dreambooth requirements...
[+] bitsandbytes version 0.35.0 installed.
[+] diffusers version 0.10.2 installed.
[+] transformers version 4.25.1 installed.
[ ] xformers version N/A installed.
[+] torch version 1.12.1+cu113 installed.
[+] torchvision version 0.13.1+cu113 installed.
#######################################################################################################

Launching Web UI with arguments: 
No module 'xformers'. Proceeding without it.

Steps to reproduce the problem

mamba create -n sd_au1111_v2 python=3.10.6 pytorch torchvision pytorch-lightning torchaudio pytorch-cuda diffusers transformers ftfy -c pytorch -c nvidia  -y
mamba activate sd_au1111_v2
pip install ninja triton functorch
pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers

bash <(wget -qO- https://raw.githubusercontent.com/AUTOMATIC1111/stable-diffusion-webui/master/webui.sh)

Ubuntu 22.04

nvcc: NVIDIA (R) Cuda compiler driver
Cuda compilation tools, release 11.7, V11.7.99
Build cuda_11.7.r11.7/compiler.31442593_0

What should have happened?

xformers should be detected by the web UI and listed as available.

Commit where the problem happens

685f9631b56ff8bd43bce24ff5ce0f9a0e9af490

What platforms do you use to access the UI?

Linux

What browsers do you use to access the UI?

Mozilla Firefox

Command Line Arguments

`./webui.sh --xformers` fails to install as well

: Couldn't install xformers.
stderr:   error: subprocess-exited-with-error

Additional information, context and logs

No response

aliencaocao commented 1 year ago

Run `pip list` and see if xformers is listed.
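
For example, a minimal check from the same shell (assuming pip resolves to the conda env's interpreter):

pip list | grep -i xformers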

mmann1123 commented 1 year ago

Yep, it's there: `xformers 0.0.15+ea1048b.d20221220`

aliencaocao commented 1 year ago

It's most likely that your `python` command aliasing is off. launch.py runs shell commands using `python`, and if that is not the same interpreter as the one in your conda env, packages end up installed in and imported from a different environment. Do you only have one Python version installed?
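
A quick way to spot that kind of mismatch (a sketch; paths will differ per install) is to compare the interpreter your shell resolves with the one Python itself reports:

which python
python -c "import sys; print(sys.executable)"

If the two point at different environments, packages installed with pip in one will not be importable from the other.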

mmann1123 commented 1 year ago

I use miniconda. I ran the `python -m xformers.info` command with my environment active. For instance:

which python
/home/mmann1123/miniconda3/envs/sd_au1111_v2/bin/python

vmajor commented 1 year ago

Same here. xformers not detected.

pip list
xformers                  0.0.15+e163309.d20221226
python -m xformers.info
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton.language'
xFormers 0.0.15+e163309.d20221226
memory_efficient_attention.cutlassF:               available
memory_efficient_attention.cutlassB:               available
memory_efficient_attention.flshattF:               available
memory_efficient_attention.flshattB:               available
memory_efficient_attention.smallkF:                available
memory_efficient_attention.smallkB:                available
memory_efficient_attention.tritonflashattF:        unavailable
memory_efficient_attention.tritonflashattB:        unavailable
swiglu.fused.p.cpp:                                available
is_triton_available:                               False
is_functorch_available:                            False
pytorch.version:                                   1.13.1
pytorch.cuda:                                      available
gpu.compute_capability:                            8.6
gpu.name:                                          NVIDIA GeForce RTX 3060
vmajor commented 1 year ago

I fixed it by editing launch.py:

commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")

WebUI does not look for xformers otherwise. It also seems to have installed its own version inside its own venv.

aliencaocao commented 1 year ago

It also seems to have installed its own version inside its own venv.

This is the correct behaviour. You are not supposed to install it outside its own venv unless you set `venv_dir` to `-`, which tells the launcher to use the global system environment instead.
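
For reference, a sketch of what that looks like in webui-user.sh (the venv_dir variable ships commented out in the stock template, so uncommenting it is an assumption about your local copy):

# webui-user.sh
# reuse the currently active (e.g. conda) environment instead of creating ./venv
venv_dir="-"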

I fixed it by editing the launch.py

You can just pass the same flag via webui-user.bat/.sh instead.
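
For example, a minimal webui-user.sh edit (a sketch; the stock template ships the same line commented out with an empty value):

# webui-user.sh
export COMMANDLINE_ARGS="--xformers"

On Windows the equivalent in webui-user.bat would be set COMMANDLINE_ARGS=--xformers.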

uwusensei commented 1 year ago

FYI, you currently have to use `--xformers`. `--force-enable-xformers` is currently broken due to https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/master/modules/import_hook.py unsetting xformers if you don't pass the `--xformers` flag specifically.
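
For context, the hook referenced above is tiny; paraphrasing the module (not a verbatim copy), it does roughly this:

import sys

# if --xformers was not passed on the command line, block any later
# "import xformers" by registering a dummy entry in sys.modules
if "--xformers" not in "".join(sys.argv):
    sys.modules["xformers"] = None

which is why having xformers importable in the environment is not enough on its own; the flag itself has to be present.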

atuarre commented 1 year ago

I fixed it by editing the launch.py

commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")

WebUI does not look for xformers otherwise. It also seems to have installed its own version inside its own venv.

Thank you for posting this. This did help me get xformers loaded.

giusparsifal commented 4 months ago

I fixed it by editing the launch.py

commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")

WebUI does not look for xformers otherwise. It also seems to have installed its own version inside its own venv.

Thank you for posting this. This did help me get xformers loaded.

Hello, do I have to write out the entire string? In my launch.py there is no "commandline_args = os.environ.get" at all. If so, where do I add that string in launch.py? Thanks!