(.venv) D:\ComfyUI\ComfyUI-to-Python-Extension>py comfyui_to_python.py
Traceback (most recent call last):
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\comfyui_to_python.py", line 17, in <module>
    from nodes import NODE_CLASS_MAPPINGS
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\model_management.py", line 120, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI-to-Python-Extension\..\comfy\model_management.py", line 89, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
I run ComfyUI with the --cpu argument, i.e. py main.py --cpu.
Is there a way to do the same with comfyui_to_python.py, e.g. py comfyui_to_python.py --cpu?
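In case it helps: a possible workaround I've been considering (untested, based on my understanding that ComfyUI parses its flags from sys.argv when the comfy package is first imported) would be to inject --cpu into sys.argv at the top of the script, before from nodes import NODE_CLASS_MAPPINGS runs:

```python
# Hypothetical workaround: ComfyUI reads its CLI flags from sys.argv at
# import time, so appending --cpu before any comfy import might make
# model_management choose the CPU device instead of calling into CUDA.
import sys

if "--cpu" not in sys.argv:
    sys.argv.append("--cpu")  # must run before `from nodes import ...`

# from nodes import NODE_CLASS_MAPPINGS  # would now import with --cpu set
```

I haven't verified that the extension's import order allows this, so a proper --cpu pass-through in comfyui_to_python.py itself would be the cleaner fix.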