invoke-ai / InvokeAI

Invoke is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate visual media using the latest AI-driven technologies. The solution offers an industry-leading WebUI and serves as the foundation for multiple commercial products.
https://invoke-ai.github.io/InvokeAI/
Apache License 2.0
23.64k stars · 2.43k forks

[BUG]: Manual InvokeAI installation fails on metadata (pyproject.toml) #6717

Closed · deffcolony closed this issue 3 months ago

deffcolony commented 3 months ago

Is there an existing issue for this problem?

Operating system

Windows

GPU vendor

Nvidia (CUDA)

GPU model

RTX 4090

GPU VRAM

24 GB

Version number

N/A

Browser

N/A

Python dependencies

No response

What happened

During a manual install of InvokeAI, the following error occurred: `Preparing metadata (pyproject.toml) did not run successfully.`

Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu121
Collecting InvokeAI
  Downloading InvokeAI-4.2.7-py3-none-any.whl.metadata (27 kB)
Collecting accelerate==0.30.1 (from InvokeAI)
  Downloading accelerate-0.30.1-py3-none-any.whl.metadata (18 kB)
Collecting clip-anytorch==2.6.0 (from InvokeAI)
  Downloading clip_anytorch-2.6.0-py3-none-any.whl.metadata (8.4 kB)
Collecting compel==2.0.2 (from InvokeAI)
  Downloading compel-2.0.2-py3-none-any.whl.metadata (12 kB)
Collecting controlnet-aux==0.0.7 (from InvokeAI)
  Downloading controlnet_aux-0.0.7.tar.gz (202 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 202.4/202.4 kB 6.2 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting diffusers==0.27.2 (from diffusers[torch]==0.27.2->InvokeAI)
  Downloading diffusers-0.27.2-py3-none-any.whl.metadata (18 kB)
Collecting invisible-watermark==0.2.0 (from InvokeAI)
  Downloading invisible_watermark-0.2.0-py3-none-any.whl.metadata (8.2 kB)
Collecting mediapipe==0.10.7 (from InvokeAI)
  Downloading mediapipe-0.10.7-cp311-cp311-win_amd64.whl.metadata (9.8 kB)
Collecting numpy==1.26.4 (from InvokeAI)
  Using cached numpy-1.26.4-cp311-cp311-win_amd64.whl.metadata (61 kB)
Collecting onnx==1.15.0 (from InvokeAI)
  Downloading onnx-1.15.0-cp311-cp311-win_amd64.whl.metadata (15 kB)
Collecting onnxruntime==1.16.3 (from InvokeAI)
  Downloading onnxruntime-1.16.3-cp311-cp311-win_amd64.whl.metadata (4.5 kB)
Collecting opencv-python==4.9.0.80 (from InvokeAI)
  Downloading opencv_python-4.9.0.80-cp37-abi3-win_amd64.whl.metadata (20 kB)
Collecting pytorch-lightning==2.1.3 (from InvokeAI)
  Downloading pytorch_lightning-2.1.3-py3-none-any.whl.metadata (21 kB)
Collecting safetensors==0.4.3 (from InvokeAI)
  Using cached safetensors-0.4.3-cp311-none-win_amd64.whl.metadata (3.9 kB)
Collecting spandrel==0.3.4 (from InvokeAI)
  Using cached spandrel-0.3.4-py3-none-any.whl.metadata (14 kB)
Collecting timm==0.6.13 (from InvokeAI)
  Downloading timm-0.6.13-py3-none-any.whl.metadata (38 kB)
Collecting torch==2.2.2 (from InvokeAI)
  Downloading https://download.pytorch.org/whl/cu121/torch-2.2.2%2Bcu121-cp311-cp311-win_amd64.whl (2454.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 GB 3.4 MB/s eta 0:00:00
Collecting torchmetrics==0.11.4 (from InvokeAI)
  Downloading torchmetrics-0.11.4-py3-none-any.whl.metadata (15 kB)
Collecting torchsde==0.2.6 (from InvokeAI)
  Using cached torchsde-0.2.6-py3-none-any.whl.metadata (5.3 kB)
Collecting torchvision==0.17.2 (from InvokeAI)
  Downloading https://download.pytorch.org/whl/cu121/torchvision-0.17.2%2Bcu121-cp311-cp311-win_amd64.whl (5.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.7/5.7 MB 40.1 MB/s eta 0:00:00
Collecting transformers==4.41.1 (from InvokeAI)
  Downloading transformers-4.41.1-py3-none-any.whl.metadata (43 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 43.8/43.8 kB ? eta 0:00:00
Collecting fastapi-events==0.11.1 (from InvokeAI)
  Downloading fastapi_events-0.11.1-py3-none-any.whl.metadata (19 kB)
Collecting fastapi==0.111.0 (from InvokeAI)
  Using cached fastapi-0.111.0-py3-none-any.whl.metadata (25 kB)
Collecting huggingface-hub==0.23.1 (from InvokeAI)
  Downloading huggingface_hub-0.23.1-py3-none-any.whl.metadata (12 kB)
Collecting pydantic-settings==2.2.1 (from InvokeAI)
  Downloading pydantic_settings-2.2.1-py3-none-any.whl.metadata (3.1 kB)
Collecting pydantic==2.7.2 (from InvokeAI)
  Downloading pydantic-2.7.2-py3-none-any.whl.metadata (108 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 108.5/108.5 kB 6.1 MB/s eta 0:00:00
Collecting python-socketio==5.11.1 (from InvokeAI)
  Downloading python_socketio-5.11.1-py3-none-any.whl.metadata (3.2 kB)
Collecting uvicorn==0.28.0 (from uvicorn[standard]==0.28.0->InvokeAI)
  Downloading uvicorn-0.28.0-py3-none-any.whl.metadata (6.3 kB)
Collecting albumentations (from InvokeAI)
  Using cached albumentations-1.4.12-py3-none-any.whl.metadata (38 kB)
Collecting blake3 (from InvokeAI)
  Downloading blake3-0.4.1-cp311-none-win_amd64.whl.metadata (4.2 kB)
Collecting click (from InvokeAI)
  Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting datasets (from InvokeAI)
  Downloading datasets-2.20.0-py3-none-any.whl.metadata (19 kB)
Collecting Deprecated (from InvokeAI)
  Using cached Deprecated-1.2.14-py2.py3-none-any.whl.metadata (5.4 kB)
Collecting dnspython~=2.4.0 (from InvokeAI)
  Downloading dnspython-2.4.2-py3-none-any.whl.metadata (4.9 kB)
Collecting dynamicprompts (from InvokeAI)
  Downloading dynamicprompts-0.31.0-py3-none-any.whl.metadata (18 kB)
Collecting easing-functions (from InvokeAI)
  Downloading easing_functions-1.0.4-py3-none-any.whl.metadata (1.6 kB)
Collecting einops (from InvokeAI)
  Using cached einops-0.8.0-py3-none-any.whl.metadata (12 kB)
Collecting facexlib (from InvokeAI)
  Downloading facexlib-0.3.0-py3-none-any.whl.metadata (4.6 kB)
Collecting matplotlib (from InvokeAI)
  Downloading matplotlib-3.9.1.tar.gz (36.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 36.1/36.1 MB 38.5 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [22 lines of output]
      + meson setup C:\Users\winuser\AppData\Local\Temp\pip-install-bwjht9bb\matplotlib_14ac6e19721743afb0d215b2b2e2ce62 C:\Users\winuser\AppData\Local\Temp\pip-install-bwjht9bb\matplotlib_14ac6e19721743afb0d215b2b2e2ce62\.mesonpy-ufhf7hxh -Dbuildtype=release -Db_ndebug=if-release -Db_vscrt=md --native-file=C:\Users\winuser\AppData\Local\Temp\pip-install-bwjht9bb\matplotlib_14ac6e19721743afb0d215b2b2e2ce62\.mesonpy-ufhf7hxh\meson-python-native-file.ini
      The Meson build system
      Version: 1.5.1
      Source dir: C:\Users\winuser\AppData\Local\Temp\pip-install-bwjht9bb\matplotlib_14ac6e19721743afb0d215b2b2e2ce62
      Build dir: C:\Users\winuser\AppData\Local\Temp\pip-install-bwjht9bb\matplotlib_14ac6e19721743afb0d215b2b2e2ce62\.mesonpy-ufhf7hxh
      Build type: native build
      Program python3 found: YES
      Project name: matplotlib
      Project version: 3.9.1
      C compiler for the host machine: gcc (gcc 9.2.0 "gcc (MinGW.org GCC Build-20200227-1) 9.2.0")
      C linker for the host machine: gcc ld.bfd 2.32
      C++ compiler for the host machine: c++ (gcc 9.2.0 "c++ (MinGW.org GCC Build-20200227-1) 9.2.0")
      C++ linker for the host machine: c++ ld.bfd 2.32
      Host machine cpu family: x86
      Host machine cpu: x86
      Program python found: YES (C:\Users\winuser\miniconda3\envs\invokeai\python.exe)
      Need python for x86, but found x86_64
      Run-time dependency python found: NO (tried sysconfig)

      ..\meson.build:37:14: ERROR: Python dependency not found

      A full log can be found at C:\Users\winuser\AppData\Local\Temp\pip-install-bwjht9bb\matplotlib_14ac6e19721743afb0d215b2b2e2ce62\.mesonpy-ufhf7hxh\meson-logs\meson-log.txt
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
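The key lines in the log are `Need python for x86, but found x86_64` and the 32-bit MinGW gcc 9.2.0 detection: Meson picked up a 32-bit C toolchain from PATH, which cannot satisfy the Python dependency of the 64-bit conda interpreter, so matplotlib's source build aborts. A quick sanity check, as a sketch (assuming `python` and `gcc` resolve to the same binaries pip's build subprocess would see):

```shell
# Pointer width of the active interpreter: 64 means a 64-bit Python
python -c "import struct; print(struct.calcsize('P') * 8)"

# Target triple of the C compiler Meson would detect; an output such as
# "mingw32" indicates a 32-bit toolchain that cannot build for a 64-bit Python
gcc -dumpmachine || echo "gcc not found on PATH"
```

If the two architectures disagree, any package that falls back to a source build (as matplotlib does here) will fail the same way.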

What you expected to happen

I expected all packages to install correctly, without the metadata (pyproject.toml) preparation step failing.

How to reproduce the problem

  1. conda create -n invokeai python=3.11 -y
  2. conda activate invokeai
  3. pip install InvokeAI --use-pep517 --extra-index-url https://download.pytorch.org/whl/cu121

Additional context

I am using miniconda3 to install InvokeAI

Discord username

No response

StAlKeR7779 commented 3 months ago

This is a problem with matplotlib: https://github.com/matplotlib/matplotlib/issues/28551#issuecomment-2267223102. It's already known and will be fixed later; for now we may pin matplotlib==3.9.0.
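Until a pinned release is published, one user-side stopgap (a sketch, not an official fix, and it assumes InvokeAI itself leaves matplotlib unpinned) is a pip constraints file that forces the last known-good version:

```text
# constraints.txt — force the matplotlib release that still has Windows wheels
matplotlib==3.9.0
```

Then install with `pip install InvokeAI -c constraints.txt --use-pep517 --extra-index-url https://download.pytorch.org/whl/cu121`; the `-c` flag applies the pin without adding matplotlib as a direct requirement.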

Coumodo commented 3 months ago

Is there any workaround?

tacaswell commented 3 months ago

The 3.9.1 wheels on windows would, depending on import order, cause segfaults, see https://github.com/matplotlib/matplotlib/issues/28551#issuecomment-2266699619.

We deleted the bad wheels to avoid segfaults (which seemed like a good idea) but failed to anticipate that this would cause downstream projects to start trying to build mpl from source instead of pulling the 3.9.0 wheel.

Sorry for the trouble.


Rather than setting a cap or hard-pinning to 3.9.0 (unless you already pin everything to exact versions), a better option is to exclude 3.9.1 (matplotlib!=3.9.1) and/or add --only-binary matplotlib to your pip options.
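In a project's metadata, that exclusion could look like the following (an illustrative fragment, not InvokeAI's actual pyproject.toml):

```toml
[project]
dependencies = [
    # skip the release whose broken Windows wheels were deleted,
    # so pip never falls back to building matplotlib from source
    "matplotlib!=3.9.1",
]
```

Alternatively, passing `--only-binary matplotlib` to pip forbids any source build for that package, so the resolver selects an older version that still ships a wheel.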

psychedelicious commented 3 months ago

@tacaswell Thanks! No worries, it's not a big deal - easy fix. Thanks for your work on matplotlib.

psychedelicious commented 3 months ago

@deffcolony @Coumodo We've released v4.2.7post1, which should resolve the issue.