lshqqytiger / stable-diffusion-webui-amdgpu

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: Install error while trying to use --onnx argument. #241

Closed Lialothedestroyer closed 10 months ago

Lialothedestroyer commented 11 months ago

Is there an existing issue for this?

What happened?

Attempted to run with the --onnx argument in order to use an Olive-optimized model; instead, I get an import error for accelerate.

Steps to reproduce the problem

  1. Navigate to the directory containing the Automatic1111 webui
  2. Run it with the --onnx argument (the exact launch command is sketched below)
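
For reference, the launch command this corresponds to (a sketch, assuming the install path shown in the console log below):

cd C:\stable-diffusion-webui-directml
webui.bat --onnx --backend directml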

What should have happened?

The program should have launched with the ability to use ONNX models.

Version or Commit where the problem happens

1.5.1

What Python version are you running on ?

Python 3.10.x

What platforms do you use to access the UI ?

Windows

What device are you running WebUI on?

AMD GPUs

Cross attention optimization

Automatic

What browsers do you use to access the UI ?

Google Chrome

Command Line Arguments

--onnx --backend directml

List of extensions

None

Console logs

webui.bat --onnx --backend directml
venv "C:\stable-diffusion-webui-directml\venv\Scripts\Python.exe"
fatal: No names found, cannot describe anything.
Python 3.10.6 | packaged by conda-forge | (main, Oct 24 2022, 16:02:16) [MSC v.1916 64 bit (AMD64)]
Version: 1.5.1
Commit hash: 9a8a2a47f63c3d9b04c014a715f95d680f461963
Installing requirements for ONNX
Installing onnxruntime
Installing onnxruntime-directml
Launching Web UI with arguments: --onnx --backend directml
Traceback (most recent call last):
  File "C:\stable-diffusion-webui-directml\launch.py", line 39, in <module>
    main()
  File "C:\stable-diffusion-webui-directml\launch.py", line 35, in main
    start()
  File "C:\stable-diffusion-webui-directml\modules\launch_utils.py", line 443, in start
    import webui
  File "C:\stable-diffusion-webui-directml\webui.py", line 47, in <module>
    from modules import paths, timer, import_hook, errors, devices  # noqa: F401
  File "C:\stable-diffusion-webui-directml\modules\paths.py", line 65, in <module>
    import sgm  # noqa: F401
  File "C:\stable-diffusion-webui-directml\repositories\generative-models\sgm\__init__.py", line 2, in <module>
    from .models import AutoencodingEngine, DiffusionEngine
  File "C:\stable-diffusion-webui-directml\repositories\generative-models\sgm\models\__init__.py", line 1, in <module>
    from .autoencoder import AutoencodingEngine
  File "C:\stable-diffusion-webui-directml\repositories\generative-models\sgm\models\autoencoder.py", line 12, in <module>
    from ..modules.diffusionmodules.model import Decoder, Encoder
  File "C:\stable-diffusion-webui-directml\repositories\generative-models\sgm\modules\__init__.py", line 1, in <module>
    from .encoders.modules import GeneralConditioner
  File "C:\stable-diffusion-webui-directml\repositories\generative-models\sgm\modules\encoders\modules.py", line 13, in <module>
    from transformers import (
  File "C:\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\__init__.py", line 26, in <module>
    from . import dependency_versions_check
  File "C:\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\dependency_versions_check.py", line 57, in <module>
    require_version_core(deps[pkg])
  File "C:\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\versions.py", line 117, in require_version_core
    return require_version(requirement, hint)
  File "C:\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\versions.py", line 111, in require_version
    _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)
  File "C:\stable-diffusion-webui-directml\venv\lib\site-packages\transformers\utils\versions.py", line 44, in _compare_versions
    raise ImportError(
ImportError: accelerate>=0.20.3 is required for a normal functioning of this module, but found accelerate==0.18.0.
Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git main
Press any key to continue . . .

Additional information

Still occurs even after running pip install -U transformers and pip install -U accelerate in an attempt to update the packages; pip reports accelerate at version 0.21.0.
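
One way to check whether the upgrade actually landed in the webui's virtual environment rather than in a system-wide Python (a sketch; the venv path is taken from the console log above):

rem ask the venv's own interpreter which accelerate it imports
C:\stable-diffusion-webui-directml\venv\Scripts\python.exe -c "import sys, accelerate; print(sys.executable, accelerate.__version__)"
rem compare with whatever python resolves to on PATH
python -c "import sys, accelerate; print(sys.executable, accelerate.__version__)"

If the two outputs differ, the upgrade went into the wrong interpreter.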

lshqqytiger commented 11 months ago

Did you run pip with the virtual environment activated?
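
For completeness, upgrading inside the venv from a Windows command prompt would look roughly like this (a sketch; paths assumed from the console log above):

cd C:\stable-diffusion-webui-directml
venv\Scripts\activate
rem the quotes keep cmd from treating ">=" as an output redirect
pip install -U "accelerate>=0.20.3"
pip show accelerate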

sooxt98 commented 11 months ago

I actually got it to work on the latest master commit by changing requirements_onnx.txt to the contents below and deleting the venv folder:

transformers
accelerate>=0.20.3
diffusers
onnx
invisible-watermark
optimum
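
One way to apply that change (a sketch; folder and file names as in this repository) is to edit requirements_onnx.txt as above, remove the old venv, and relaunch so the dependencies are reinstalled:

cd C:\stable-diffusion-webui-directml
rem after saving the edited requirements_onnx.txt:
rmdir /s /q venv
webui.bat --onnx --backend directml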

But now I'm facing another issue with --olive --backend directml --autolaunch (see attached screenshot).

lshqqytiger commented 11 months ago

You can find a solution in the FAQ (#149). If you want to run Olive, run pip install torch==1.13.1 torchvision==0.14.1 torch-directml==0.1.13.1.dev230413.
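
A quick sanity check after installing those packages inside the activated venv (a sketch; it assumes torch_directml.device() from the torch-directml package imports cleanly):

pip install torch==1.13.1 torchvision==0.14.1 torch-directml==0.1.13.1.dev230413
python -c "import torch, torch_directml; print(torch.__version__, torch_directml.device())"

If this prints 1.13.1 and a DirectML device, the prerequisites for Olive are in place.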

Lialothedestroyer commented 11 months ago

Changing requirements_onnx.txt to require "accelerate>=0.20.3" seems to have fixed it. I had already run pip install torch==1.13.1 torchvision==0.14.1 torch-directml==0.1.13.1.dev230413 beforehand, and now it launches every time without fail.

lshqqytiger commented 10 months ago

Fixed in 45d7cc11987f266b45072616e981c9c79423f5e8