mmann1123 opened this issue 1 year ago
Run pip list and see if xformers is in it.
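For example (a quick check, assuming a POSIX shell with grep available; pip show works as well):
pip list | grep -i xformers
python -m pip show xformers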
Yep, it's there:
xformers 0.0.15+ea1048b.d20221220
It's most likely that your python command aliasing is off.
launch.py runs shell commands using python, and if that is not the same Python you have on your PATH, then the packages will be in a different environment. Do you only have one Python version installed?
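A minimal sanity check is to compare the interpreter your shell resolves with the one inside the webui's venv (the venv path below is an assumption; yours may differ):
python -c "import sys; print(sys.executable)"
venv/bin/python -c "import sys; print(sys.executable)"
venv/bin/python -m pip show xformers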
I use miniconda. I ran the python -m xformers.info command in my active environment. For instance:
which python
/home/mmann1123/miniconda3/envs/sd_au1111_v2/bin/python
Same here. xformers not detected.
pip list
xformers 0.0.15+e163309.d20221226
python -m xformers.info
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton.language'
xFormers 0.0.15+e163309.d20221226
memory_efficient_attention.cutlassF: available
memory_efficient_attention.cutlassB: available
memory_efficient_attention.flshattF: available
memory_efficient_attention.flshattB: available
memory_efficient_attention.smallkF: available
memory_efficient_attention.smallkB: available
memory_efficient_attention.tritonflashattF: unavailable
memory_efficient_attention.tritonflashattB: unavailable
swiglu.fused.p.cpp: available
is_triton_available: False
is_functorch_available: False
pytorch.version: 1.13.1
pytorch.cuda: available
gpu.compute_capability: 8.6
gpu.name: NVIDIA GeForce RTX 3060
I fixed it by editing launch.py:
commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")
WebUI does not look for xformers otherwise. It also seems to have installed its own version inside its own venv.
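For context, the stock line in launch.py had an empty default, so the edit just changes that default (a sketch; the exact wording may differ between webui versions):
# before: no extra arguments unless COMMANDLINE_ARGS is set in the environment
commandline_args = os.environ.get('COMMANDLINE_ARGS', "")
# after: xformers enabled by default
commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")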
It also seems to have installed its own version inside its own venv.
This is the correct behaviour. You are not supposed to install it outside its own venv unless you set venv to "-", which means "use the global system environment".
I fixed it by editing launch.py
You can just pass the same argument via webui-user.bat/.sh.
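The usual pattern (both files ship with an empty or commented-out COMMANDLINE_ARGS line, so this is a sketch rather than a guaranteed match for your copy) is:
In webui-user.sh:
export COMMANDLINE_ARGS="--xformers"
In webui-user.bat:
set COMMANDLINE_ARGS=--xformers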
FYI, you currently have to use --xformers. --force-enable is currently broken, because https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/master/modules/import_hook.py unsets xformers if you don't pass the --xformers flag specifically.
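Roughly, that import hook does something like the following (a paraphrase, not the exact file contents), which is why the flag has to be --xformers specifically:
import sys

# Block any later "import xformers" unless --xformers was passed on the
# command line, effectively disabling xformers otherwise.
if "--xformers" not in "".join(sys.argv):
    sys.modules["xformers"] = None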
I fixed it by editing launch.py
commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")
WebUI does not look for xformers otherwise. It also seems to have installed its own version inside its own venv.
Thank you for posting this. This did help me get xformers loaded.
I fixed it by editing launch.py
commandline_args = os.environ.get('COMMANDLINE_ARGS', "--xformers")
WebUI does not look for xformers otherwise. It also seems to have installed its own version inside its own venv.
Thank you for posting this. This did help me get xformers loaded.
Hello, do I have to write out the entire string? In my launch.py there is no "commandline_args = os.environ.get" at all. If yes, where do I have to add that string in launch.py? Thanks!
Is there an existing issue for this?
What happened?
xformers is installed and available in my conda env, yet it is not detected by the webui.
To demonstrate that xformers is working:
python -m xformers.info
But when running
bash <(wget -qO- https://raw.githubusercontent.com/AUTOMATIC1111/stable-diffusion-webui/master/webui.sh)
I'm getting:
Steps to reproduce the problem
Ubuntu 22.04
nvcc: NVIDIA (R) Cuda compiler driver
Cuda compilation tools, release 11.7, V11.7.99
Build cuda_11.7.r11.7/compiler.31442593_0
What should have happened?
xformers should have been detected and available.
Commit where the problem happens
685f9631b56ff8bd43bce24ff5ce0f9a0e9af490
What platforms do you use to access the UI?
Linux
What browsers do you use to access the UI?
Mozilla Firefox
Command Line Arguments
Additional information, context and logs
No response