Open NotolabrusFisicola opened 3 months ago
Your torch version is too old to support fp8; you need to upgrade it.
pip install --upgrade torch torchvision torchaudio
This may solve your problem (run it inside your Forge venv).
I don't have a venv folder in Forge; it's using A1111's venv folder instead.
Well, AFAIK torch.float8_e4m3fn is only available from PyTorch 2.1 onwards.
Not sure sharing the A1111 venv with Forge is the best approach either since the requirements can differ between them.
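A quick way to check whether an installed torch build is new enough is to compare its version string against 2.1, where torch.float8_e4m3fn first appeared. This is just a sketch; supports_fp8 is a hypothetical helper, not part of Forge:

```python
def supports_fp8(torch_version: str) -> bool:
    # torch.float8_e4m3fn first shipped in PyTorch 2.1, so anything
    # older (e.g. the custom 2.0.x IPEX Arc builds) hits this error.
    release = torch_version.split("+")[0]              # drop local tags like +xpu
    major, minor = (int(p) for p in release.split(".")[:2])
    return (major, minor) >= (2, 1)

# At runtime you can also just probe for the dtype directly:
#   hasattr(torch, "float8_e4m3fn")
print(supports_fp8("2.0.110"))            # False - old IPEX Arc build
print(supports_fp8("2.1.0a0+git7bcf7da"))  # True
```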
Intel Arc still uses a custom build of IPEX 2.0.110 (see the prepare_environment function in launch_utils.py). I have updated the installation script to work around that by referencing a newer build of the IPEX libraries by Nuullll (https://github.com/Nuullll/intel-extension-for-pytorch/releases/tag/v2.1.20%2Bmtl%2Boneapi), which makes the error go away by bumping torch to 2.1. However, this particular build is a CPU AOT build, so Arc users will see the first inference take an ungodly amount of time (for me, a single SDXL image, 1024x1024, 20 steps, took 8.5 minutes on the first run and 11 seconds on subsequent ones). Better than not having Forge at all, I suppose.
I will see if I can produce a GPU AOT build to work around that - if I do, I will upload the wheels somewhere and follow up with a PR. For now, Arc users can apply the following patch:
modules/launch_utils.py | 10 +++++-----
1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/modules/launch_utils.py b/modules/launch_utils.py
index 17202876..b0c48462 100644
--- a/modules/launch_utils.py
+++ b/modules/launch_utils.py
@@ -367,16 +367,16 @@ def prepare_environment():
if platform.system() == "Windows":
# The "Nuullll/intel-extension-for-pytorch" wheels were built from IPEX source for Intel Arc GPU: https://github.com/intel/intel-extension-for-pytorch/tree/xpu-main
# This is NOT an Intel official release so please use it at your own risk!!
- # See https://github.com/Nuullll/intel-extension-for-pytorch/releases/tag/v2.0.110%2Bxpu-master%2Bdll-bundle for details.
+ # See https://github.com/Nuullll/intel-extension-for-pytorch/releases/tag/v2.1.20%2Bmtl%2Boneapi for details.
#
- # Strengths (over official IPEX 2.0.110 windows release):
- # - AOT build (for Arc GPU only) to eliminate JIT compilation overhead: https://github.com/intel/intel-extension-for-pytorch/issues/399
+ # Strengths (over official IPEX 2.1.20 windows release):
+ # - AOT build (for Arc GPU only) to eliminate JIT compilation overhead: https://github.com/intel/intel-extension-for-pytorch/issues/399 [UPD] The 2.1.20 build referenced in this commit is built for CPU only; expect the first run to be quite slow
# - Bundles minimal oneAPI 2023.2 dependencies into the python wheels, so users don't need to install oneAPI for the whole system.
# - Provides a compatible torchvision wheel: https://github.com/intel/intel-extension-for-pytorch/issues/465
# Limitation:
# - Only works for python 3.10
- url_prefix = "https://github.com/Nuullll/intel-extension-for-pytorch/releases/download/v2.0.110%2Bxpu-master%2Bdll-bundle"
- torch_command = os.environ.get('TORCH_COMMAND', f"pip install {url_prefix}/torch-2.0.0a0+gite9ebda2-cp310-cp310-win_amd64.whl {url_prefix}/torchvision-0.15.2a0+fa99a53-cp310-cp310-win_amd64.whl {url_prefix}/intel_extension_for_pytorch-2.0.110+gitc6ea20b-cp310-cp310-win_amd64.whl")
+ url_prefix = "https://github.com/Nuullll/intel-extension-for-pytorch/releases/download/v2.1.20%2Bmtl%2Boneapi"
+ torch_command = os.environ.get('TORCH_COMMAND', f"pip install {url_prefix}/torch-2.1.0a0+git7bcf7da-cp310-cp310-win_amd64.whl {url_prefix}/torchvision-0.16.0+fbb4cc5-cp310-cp310-win_amd64.whl {url_prefix}/intel_extension_for_pytorch-2.1.20+git4849f3b-cp310-cp310-win_amd64.whl")
else:
# Using official IPEX release for linux since it's already an AOT build.
# However, users still have to install oneAPI toolkit and activate oneAPI environment manually.
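For reference, the override mechanism the patch relies on can be sketched as below. The URL and wheel names are placeholders, not the real release assets; the point is that setting the TORCH_COMMAND environment variable takes precedence over the pinned wheels, so users can point at their own builds without patching the file:

```python
import os

# Sketch of the TORCH_COMMAND pattern from launch_utils.py: the
# environment variable, when set, wins over the pinned wheel URLs.
url_prefix = "https://example.com/ipex-wheels"  # placeholder, not a real index
default_cmd = (
    f"pip install {url_prefix}/torch-2.1.0-cp310-cp310-win_amd64.whl "
    f"{url_prefix}/intel_extension_for_pytorch-2.1.20-cp310-cp310-win_amd64.whl"
)
torch_command = os.environ.get("TORCH_COMMAND", default_cmd)
print(torch_command)
```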
I've been pulling updates nearly every day since the revamp, but after today's pull, this error popped up while trying to run the web-ui.
Running with the following COMMANDLINE_ARGS: --xformers --cuda-malloc --cuda-stream
Specs: RTX 2060, 6GB VRAM