patientx / ComfyUI-Zluda

The most powerful and modular Stable Diffusion GUI, API, and backend with a graph/nodes interface. Now ZLUDA-enhanced for better AMD GPU performance.
GNU General Public License v3.0

XPU changes broke ComfyUI startup #17

Closed: jonatj closed this issue 3 months ago

jonatj commented 3 months ago

Expected Behavior

.\start.bat will load ComfyUI

Actual Behavior

.\start.bat starts to load then fails with error:

Traceback (most recent call last):
  File "E:\pinokio\api\ComfyUI-Zluda\main.py", line 86, in <module>
    import execution
  File "E:\pinokio\api\ComfyUI-Zluda\execution.py", line 13, in <module>
    import nodes
  File "E:\pinokio\api\ComfyUI-Zluda\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\model_management.py", line 139, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\model_management.py", line 106, in get_torch_device
    return torch.device("xpu", torch.xpu.current_device())
  File "E:\pinokio\api\ComfyUI-Zluda\venv\lib\site-packages\torch\xpu\__init__.py", line 257, in current_device
    _lazy_init()
  File "E:\pinokio\api\ComfyUI-Zluda\venv\lib\site-packages\torch\xpu\__init__.py", line 115, in _lazy_init
    raise AssertionError("Torch not compiled with XPU enabled")
AssertionError: Torch not compiled with XPU enabled

Steps to Reproduce

git pull the latest build and run start.bat

Debug Logs

(venv) (base) PS E:\pinokio\api\ComfyUI-Zluda> .\start.bat
*** Checking and updating to new version if possible
Already up to date.

[START] Security scan
[DONE] Security scan
## ComfyUI-Manager: installing dependencies done.
** ComfyUI startup time: 2024-08-22 23:14:39.166278
** Platform: Windows
** Python version: 3.10.14 | packaged by conda-forge | (main, Mar 20 2024, 12:40:08) [MSC v.1938 64 bit (AMD64)]
** Python executable: E:\pinokio\api\ComfyUI-Zluda\venv\Scripts\python.exe
** ComfyUI Path: E:\pinokio\api\ComfyUI-Zluda
** Log path: E:\pinokio\api\ComfyUI-Zluda\comfyui.log

Prestartup times for custom nodes:
   0.0 seconds: E:\pinokio\api\ComfyUI-Zluda\custom_nodes\rgthree-comfy
   1.8 seconds: E:\pinokio\api\ComfyUI-Zluda\custom_nodes\ComfyUI-Manager

Traceback (most recent call last):
  File "E:\pinokio\api\ComfyUI-Zluda\main.py", line 86, in <module>
    import execution
  File "E:\pinokio\api\ComfyUI-Zluda\execution.py", line 13, in <module>
    import nodes
  File "E:\pinokio\api\ComfyUI-Zluda\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\model_management.py", line 139, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
  File "E:\pinokio\api\ComfyUI-Zluda\comfy\model_management.py", line 106, in get_torch_device
    return torch.device("xpu", torch.xpu.current_device())
  File "E:\pinokio\api\ComfyUI-Zluda\venv\lib\site-packages\torch\xpu\__init__.py", line 257, in current_device
    _lazy_init()
  File "E:\pinokio\api\ComfyUI-Zluda\venv\lib\site-packages\torch\xpu\__init__.py", line 115, in _lazy_init
    raise AssertionError("Torch not compiled with XPU enabled")
AssertionError: Torch not compiled with XPU enabled

Other

No response

greedy-n5q commented 3 months ago

running fix_torch.py worked for me

patientx commented 3 months ago

I didn't encounter the problem myself, but try this and please reply if it solves your problem:

running fix_torch.py worked for me

jonatj commented 3 months ago

Yes, running 'python fix_torch.py' seemed to work for me too. Thank you.
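
For context, the traceback shows why the crash happens: `get_torch_device()` in `comfy/model_management.py` takes the XPU branch even though the installed torch build was not compiled with XPU support, so `torch.xpu.current_device()` raises the AssertionError at import time. Below is a minimal defensive sketch of that device-selection step (the helper name `pick_device` is hypothetical; this is not the repository's actual code, and `fix_torch.py` remains the fix confirmed in this thread):

```python
def pick_device() -> str:
    """Return a device string, falling back gracefully when a
    backend (XPU, CUDA) is absent from the installed torch build."""
    try:
        import torch
    except ImportError:
        return "cpu"  # no torch installed at all: CPU only
    # Guard the XPU branch: torch.xpu may be missing on older builds,
    # and is_available() is False when torch was not compiled with XPU,
    # so current_device() is never reached on an unsupported build.
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return f"xpu:{torch.xpu.current_device()}"
    if torch.cuda.is_available():
        return f"cuda:{torch.cuda.current_device()}"
    return "cpu"
```

With a guard like this, the import chain `main.py → execution.py → nodes.py → comfy.sd → model_management` would fall through to CUDA (via ZLUDA) or CPU instead of aborting startup.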