AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: A matching Triton is not available #7115

Closed. CypherQube closed this issue 1 year ago.

CypherQube commented 1 year ago

Is there an existing issue for this?

What happened?

Since a recent update, the web UI has been giving me an error related to xformers.

Steps to reproduce the problem

Launching the web UI with the --xformers flag

What should have happened?

There should be no error; there were no errors previously.

Commit where the problem happens

e8c3d03f7d9966b81458944efb25666b2143153f

What platforms do you use to access the UI?

Windows

What browsers do you use to access the UI?

Mozilla Firefox, Brave

Command Line Arguments

--xformers
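
For reference, on a typical Windows install this flag is passed through webui-user.bat. A sketch of the assumed setup (the surrounding lines are the stock template of that file):

@echo off

set PYTHON=
set GIT=
set VENV_DIR=
set COMMANDLINE_ARGS=--xformers

call webui.bat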

Additional information, context and logs

Launching Web UI with arguments: --xformers
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
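
For context, this warning comes from an optional-import guard: xformers tries to import the triton package and, when that import fails, it logs the message above and continues without the Triton-backed kernels. A minimal sketch of that pattern (not xformers' actual source code):

# Sketch of the optional-import pattern behind the warning above; not xformers' real code.
try:
    import triton  # no Windows wheels were published at the time of this thread
    TRITON_AVAILABLE = True
except ImportError as err:  # ModuleNotFoundError is a subclass of ImportError
    TRITON_AVAILABLE = False
    print(
        "A matching Triton is not available, some optimizations will not be enabled."
        f" Error caught was: {err}"
    )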

vanhoe0 commented 1 year ago

Before the git pull, I deleted the venv. After the git pull, launching webui-user.bat shows the Triton error.

Creating venv in directory C:\stable-diffusion-webui\venv using python "C:\Users\vanhoe\AppData\Local\Programs\Python\Python310\python.exe"
venv "C:\stable-diffusion-webui\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug  1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Commit hash: e407d1af897a7896d8c81e32dc86e7eb753ce207
Installing torch and torchvision
Installing clip
Installing open_clip
Installing xformers
Installing requirements for CodeFormer
Installing requirements for Web UI
Installing send2trash==1.8.0
Installing dynamicprompts[attentiongrabber,magicprompt]==0.2.6

Launching Web UI with arguments: --xformers --ckpt-dir D:\MODEL
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
Loading booru2prompt settings
C:\stable-diffusion-webui\venv\Scripts\python.exe
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Loading weights [038ba203d8] from D:\MODEL\AbyssOrangeMix2_sfw.safetensors
Loading VAE weights specified in settings: D:\MODEL\VAE\autoencoder_fix_kl-f8-trinart_characters.vae.pt
Applying xformers cross attention optimization.
Textual inversion embeddings loaded(1): bad_prompt_version2
Model loaded in 7.5s (0.3s create model, 5.9s load weights).
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.

ChinatsuHS commented 1 year ago

Same issue: it seems this new module is currently bugged (the wheels failed).

It is also a Linux-only module at the moment, so that may be why it is failing under Windows.

meky84 commented 1 year ago

Same issue, Win11 user.

JackEllie commented 1 year ago

This is normal in Windows environments; please ignore it, although I believe this error message should be hidden.

#5939

mezotaken commented 1 year ago

865af20d8a4a823df3c950f5c9c9092a541bc57a

tsound97 commented 1 year ago

@mezotaken I ran into the same problem and solved it with this: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/6601. Some of my friends said that a missing Triton sometimes causes the generated results to be blurry.
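
For anyone following that route: the general shape of the fix is to upgrade torch and xformers inside the web UI's venv. The commands below are only a sketch of that idea, not an exact recipe; follow the linked discussion for the specific, compatible versions.

venv\Scripts\activate
pip install --upgrade torch torchvision
pip install --upgrade xformers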

mezotaken commented 1 year ago

yes, that's highly experimental. Or do you mean that it works for torch 1.13.1?

tsound97 commented 1 year ago

> yes, that's highly experimental. Or do you mean that it works for torch 1.13.1?

Yes, it works after I updated torch and xformers.

Testertime commented 1 year ago

> 865af20

This issue can occur on Linux too, not just "falsely" on Windows as claimed here. For instance, I was able to successfully compile xformers on Paperspace for the M4000. However, after installing it and starting the web UI, it threw "A matching Triton is not available" and xformers was not enabled.

Now guess what happened after I updated the web UI with this commit? The error message is gone, but this is STILL a problem. The error message should still be there to show that something with xformers isn't working or isn't compatible on the platform. Because, as it turns out, the hardware is simply too old to make proper use of xformers' performance optimizations.

But "yada yada, let's just hide the message" doesn't fix the underlying issue.

MadhuSaran26 commented 1 year ago

I got the same error while trying to install xformers in a conda environment. The issue seems to have been resolved by:

  1. Installing triton with pip install triton
  2. Confirming that triton is available by checking the output of python -m xformers.info (a quick check from within Python is also sketched below)
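
A quick complementary check from within the same environment (a sketch; python -m xformers.info remains the more complete report):

# Confirm that the current interpreter can find the 'triton' package at all
import importlib.util

print("triton importable:", importlib.util.find_spec("triton") is not None)
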
GioPetro commented 1 year ago

Hi,

It seems triton is not a valid lib. My pip couldn't find any package named triton. I've searched https://pypi.org/search/?q=triton, and triton==2.1.0 fails as well. Is there any other library I should try? I'm on W11... Thanks.

enochianborg commented 1 year ago
> 1. pip install triton

This may work on a Linux install, but because there is no release candidate of Triton for Windows, it does not solve the OP's problem. I tried it, and the result was:

pip install triton
WARNING: Ignoring invalid distribution -orch (c:\users\enoch\appdata\local\programs\python\python310\lib\site-packages)
ERROR: Could not find a version that satisfies the requirement triton (from versions: none)
ERROR: No matching distribution found for triton

I also ran into the Triton error when trying to create LoRAs using the Kohya GUI, and found the only workaround is to not use fp16 for mixed precision but set it to none, using fp16 for the output (save) precision only (a sketch of the corresponding flags follows below). Hopefully there will be a fix for this in the near future with a Windows release of Triton. Reference site for the Triton project: https://pypi.org/project/triton/#files
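
For the Kohya route, that workaround appears to map onto the mixed-precision and save-precision options of kohya-ss/sd-scripts. A sketch of the kind of invocation meant, with flag names taken as assumptions about that project's CLI (check them against your installed version) and all other training arguments omitted:

accelerate launch train_network.py --mixed_precision no --save_precision fp16 [other training arguments]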

merket commented 4 days ago
> 1. pip install triton

pip install triton
ERROR: Could not find a version that satisfies the requirement triton (from versions: none)
ERROR: No matching distribution found for triton

merket commented 4 days ago
> 1. pip install triton
>
> This may work on a Linux install, but because there is no release candidate of Triton for Windows, it does not solve the OP's problem. I tried it, and the result was:
>
> pip install triton
> WARNING: Ignoring invalid distribution -orch (c:\users\enoch\appdata\local\programs\python\python310\lib\site-packages)
> ERROR: Could not find a version that satisfies the requirement triton (from versions: none)
> ERROR: No matching distribution found for triton
>
> I also ran into the Triton error when trying to create LoRAs using the Kohya GUI, and found the only workaround is to not use fp16 for mixed precision but set it to none, using fp16 for the output (save) precision only. Hopefully there will be a fix for this in the near future with a Windows release of Triton. Reference site for the Triton project: https://pypi.org/project/triton/#files

Hi, I am in the exact same situation, trying to train a LoRA using Kohya. When you say not to use fp16: I see two options related to that, "Mixed precision" and "Save precision", both set to fp16. Do you suggest I set them both to "none"? Also, does setting fp16 to none affect quality at all? And when you say "use fp16 for output only", where do I set that?