lllyasviel / stable-diffusion-webui-forge


[Bug]: First install ModuleNotFoundError: import of xformers halted; none in sys.modules #323

Open MysticDaedra opened 9 months ago

MysticDaedra commented 9 months ago

What happened?

I'm trying to install SD Forge for the first time. I've run webui-user.bat, and every time I get the same error: ModuleNotFoundError: import of xformers halted; None in sys.modules. The full traceback is in the console logs below.

Steps to reproduce the problem

  1. Run webui-user.bat.
  2. Watch the console as it errors out.

What should have happened?

WebUI should have installed xformers and started normally. No extensions, no customizations, nothing; this is literally a first attempt to get it running.

What browsers do you use to access the UI?

Mozilla Firefox

Sysinfo

Can't access the webui since it keeps erroring out before loading. That being said, here is some basic info, if it matters (pretty sure it doesn't): Windows 11 Professional, RTX 3070, 32 GB RAM, Ryzen 7 5700X.

Console logs

venv "D:\SD Forge\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug  1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: f0.0.14v1.8.0rc-latest-184-g43c9e3b5
Commit hash: 43c9e3b5ce1642073c7a9684e36b45489eeb4a49
Launching Web UI with arguments: --ckpt-dir D:\Stable Diffusion Files\Models\Checkpoints --hypernetwork-dir D:\Stable Diffusion Files\Models\Hypernetworks --embeddings-dir D:\Stable Diffusion Files\Models\Embeddings --lora-dir D:\Stable Diffusion Files\Models\Loras
Total VRAM 8192 MB, total RAM 32668 MB
WARNING:xformers:A matching Triton is not available, some optimizations will not be enabled
Traceback (most recent call last):
  File "D:\SD Forge\venv\lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "D:\SD Forge\venv\lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
    import triton
ModuleNotFoundError: No module named 'triton'
xformers version: 0.0.25.dev741
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3070 : native
VAE dtype: torch.bfloat16
Traceback (most recent call last):
  File "D:\SD Forge\launch.py", line 51, in <module>
    main()
  File "D:\SD Forge\launch.py", line 47, in main
    start()
  File "D:\SD Forge\modules\launch_utils.py", line 541, in start
    import webui
  File "D:\SD Forge\webui.py", line 19, in <module>
    initialize.imports()
  File "D:\SD Forge\modules\initialize.py", line 53, in imports
    from modules import processing, gradio_extensons, ui  # noqa: F401
  File "D:\SD Forge\modules\processing.py", line 18, in <module>
    import modules.sd_hijack
  File "D:\SD Forge\modules\sd_hijack.py", line 5, in <module>
    from modules import devices, sd_hijack_optimizations, shared, script_callbacks, errors, sd_unet, patches
  File "D:\SD Forge\modules\sd_hijack_optimizations.py", line 13, in <module>
    from modules.hypernetworks import hypernetwork
  File "D:\SD Forge\modules\hypernetworks\hypernetwork.py", line 13, in <module>
    from modules import devices, sd_models, shared, sd_samplers, hashes, sd_hijack_checkpoint, errors
  File "D:\SD Forge\modules\sd_models.py", line 20, in <module>
    from modules_forge import forge_loader
  File "D:\SD Forge\modules_forge\forge_loader.py", line 5, in <module>
    from ldm_patched.modules import model_detection
  File "D:\SD Forge\ldm_patched\modules\model_detection.py", line 5, in <module>
    import ldm_patched.modules.supported_models
  File "D:\SD Forge\ldm_patched\modules\supported_models.py", line 5, in <module>
    from . import model_base
  File "D:\SD Forge\ldm_patched\modules\model_base.py", line 6, in <module>
    from ldm_patched.ldm.modules.diffusionmodules.openaimodel import UNetModel, Timestep
  File "D:\SD Forge\ldm_patched\ldm\modules\diffusionmodules\openaimodel.py", line 22, in <module>
    from ..attention import SpatialTransformer, SpatialVideoTransformer, default
  File "D:\SD Forge\ldm_patched\ldm\modules\attention.py", line 21, in <module>
    import xformers
ModuleNotFoundError: import of xformers halted; None in sys.modules
Press any key to continue . . .

Additional information

I'm using a venv copied over from SD.Next. PyTorch is a nightly build for CUDA 12.1.
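One quick check for this setup (a sketch, not something from the thread itself; it assumes the D:\SD Forge venv shown in the log above) is to confirm that the copied venv's xformers build still matches the PyTorch sitting next to it, since xformers wheels are compiled against specific torch releases and a copied venv plus a nightly torch can easily end up mismatched:

  # PowerShell, run from the Forge install folder
  .\venv\Scripts\Activate.ps1
  python -c "import torch; print(torch.__version__, torch.version.cuda)"
  python -m xformers.info

If python -m xformers.info errors out or reports a different torch version than the one printed above, the xformers install is broken, and removing or reinstalling it (as suggested in the replies below) is the usual fix.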

zrlu commented 9 months ago

Hi, I had the same issue after checking out the forge branch from an older version. Could you try the following:

  1. In PowerShell, activate the venv and then uninstall xformers: run .\venv\Scripts\Activate.ps1, then pip uninstall xformers (see the sketch below).

  2. Restart WebUI
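Spelled out (a sketch of the same steps, assuming the Forge install folder from the log above):

  # PowerShell, run from the Forge install folder
  .\venv\Scripts\Activate.ps1     # activate the venv Forge launches with
  pip uninstall -y xformers       # remove the broken/mismatched xformers build
  deactivate                      # then restart webui-user.bat as usual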

dimtoneff commented 9 months ago

set COMMANDLINE_ARGS=--xformers

in webui-user.bat & restart forge
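For reference, a minimal webui-user.bat with that flag could look like the sketch below; only the COMMANDLINE_ARGS line comes from this thread, the rest is the stock template and may differ in your copy:

  @echo off

  set PYTHON=
  set GIT=
  set VENV_DIR=
  set COMMANDLINE_ARGS=--xformers

  call webui.bat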

emompi commented 8 months ago

set COMMANDLINE_ARGS=--xformers

in webui-user.bat & restart forge

This fixed it for me, thanks!

hansolocambo commented 8 months ago

Do we need to activate xformers with Forge?

nyukers commented 8 months ago

set COMMANDLINE_ARGS=--xformers

in webui-user.bat & restart forge

No, it's not working.

nyukers commented 8 months ago

Do we need to activate xformers with Forge?

With xformers it hangs; without xformers it crashes at startup.

aunymoons commented 7 months ago

The proposed solution isn't working for me either. I can't do any kind of inference; it just fails with:

TypeError: 'NoneType' object is not iterable

Zueuk commented 7 months ago

set COMMANDLINE_ARGS=--xformers

in webui-user.bat & restart forge

But I don't want to use xformers

Dreamz-Dziner commented 6 months ago

set COMMANDLINE_ARGS=--xformers in webui-user.bat & restart forge

But I don't want to use xformers

I was able to uninstall xformers from Forge using the following steps (see the sketch below):

  1. First, remove the --xformers flag from your .bat file, which you may have already done.
  2. Go to your Forge main folder and delete the xformers package from site-packages (\system\python\Lib\site-packages).

Hopefully you can now run SD Forge without any xformers error. Cheers!
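Spelled out in PowerShell (a sketch only; the embedded-Python layout and exact folder names are assumptions and may differ between Forge releases):

  # PowerShell, run from the Forge main folder
  Remove-Item -Recurse -Force .\system\python\Lib\site-packages\xformers
  Remove-Item -Recurse -Force .\system\python\Lib\site-packages\xformers-*.dist-info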