cozymantis / human-parser-comfyui-node

A ComfyUI node to automatically extract masks for body regions and clothing/fashion items. Made with 💚 by the CozyMantis squad.

RuntimeError: Ninja is required to load C++ extensions #3

Closed · MoonMoon82 closed this issue 8 months ago

MoonMoon82 commented 8 months ago

Hi!

I installed human-parser-comfyui-node through the ComfyUI Manager, and as soon as I start ComfyUI, the following error messages appear:

e:\StableDiffusion\ComfyUI_windows_portable>.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --listen --disable-auto-launch --preview-method auto
[ComfyUI-Manager] Logging failed: [WinError 32] The process cannot access the file because it is being used by another process: 'comfyui.log' -> 'comfyui.prev.log'
** ComfyUI startup time: 2024-03-10 09:43:15.496284
** Platform: Windows
** Python version: 3.11.6 (tags/v3.11.6:8b6ee5b, Oct  2 2023, 14:57:12) [MSC v.1935 64 bit (AMD64)]
** Python executable: e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\python.exe
** Log path: e:\StableDiffusion\ComfyUI_windows_portable\comfyui.log

Prestartup times for custom nodes:
   0.0 seconds: E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
   0.0 seconds: E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager

Total VRAM 24575 MB, total RAM 65459 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
VAE dtype: torch.bfloat16
Using pytorch cross attention
[Crystools INFO] Crystools version: 1.11.0
[Crystools INFO] CPU: AMD Ryzen 9 5950X 16-Core Processor - Arch: AMD64 - OS: Windows 10
[Crystools INFO] GPU/s:
[Crystools INFO] 0) NVIDIA GeForce RTX 3090
[Crystools INFO] NVIDIA Driver: 551.23

..
..

Traceback (most recent call last):
  File "e:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1887, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\__init__.py", line 1, in <module>
    from .HumanParserLIPCustomNode import HumanParserLIPCustomNode
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\HumanParserLIPCustomNode.py", line 5, in <module>
    from .utils import generate
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\utils.py", line 8, in <module>
    from .schp import networks
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\networks\__init__.py", line 3, in <module>
    from .AugmentCE2P import resnet101
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\networks\AugmentCE2P.py", line 21, in <module>
    from ..modules import InPlaceABNSync
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\__init__.py", line 1, in <module>
    from .bn import ABN, InPlaceABN, InPlaceABNSync
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\bn.py", line 10, in <module>
    from .functions import *
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\functions.py", line 12, in <module>
    _backend = load(name="inplace_abn",
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1306, in load
    return _jit_compile(
           ^^^^^^^^^^^^^
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1710, in _jit_compile
    _write_ninja_file_and_build_library(
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1793, in _write_ninja_file_and_build_library
    verify_ninja_availability()
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1842, in verify_ninja_availability
    raise RuntimeError("Ninja is required to load C++ extensions")
RuntimeError: Ninja is required to load C++ extensions

Cannot import E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node module for custom nodes: Ninja is required to load C++ extensions

As you can see, I'm running the Windows portable version of ComfyUI, and I have already installed Ninja in the embedded Python environment:

E:\StableDiffusion\ComfyUI_windows_portable\python_embeded>python.exe -m pip install ninja
Requirement already satisfied: ninja in e:\stablediffusion\comfyui_windows_portable\python_embeded\lib\site-packages (1.11.1.1)

I'm not sure how to proceed to get human-parser-comfyui-node running properly. I would really appreciate it if you have any idea how to solve this issue!

Kind regards!

jandolina commented 8 months ago

I am getting issues with Ninja as well. It says that the requirement is already met. How can I fix this?

MoonMoon82 commented 8 months ago

@jandolina I still have no idea how to solve it. I'm not sure if this repo is still alive :(

jandolina commented 8 months ago

I am using the Clothes Swapper. As part of that workflow, there's a face swapper. If you don't need the face swap, I think the Human Parser is left out.

offmybach commented 8 months ago

> I am getting issues with Ninja as well. It says that the requirement is already met. How can I fix this?

I second that.

offmybach commented 8 months ago

> Hi! I installed human-parser-comfyui-node through the ComfyUI Manager, and as soon as I start ComfyUI, the following error messages appear: […]
>
> I'm not sure how to proceed to get human-parser-comfyui-node running properly. I would really appreciate it if you have any idea how to solve this issue!

Same issue here.

gabidobo commented 8 months ago

Hi everyone, I'm looking into this but it's difficult to test things out since I don't have easy access to a Windows machine.

But it basically looks like Windows can't find the "ninja.exe" file. I'm guessing that pip installed it somewhere inside the embedded Python environment, but that folder isn't on your system PATH, so the build step can't see it.

Once you locate ninja.exe, I think there are two options. The first is to add the folder that contains it to your system PATH.

A quick Google gave me this: https://www.mathworks.com/matlabcentral/answers/94933-how-do-i-edit-my-system-path-in-windows - but I bet there are better tutorials out there too. Remember, you need to enter the path to the folder containing the ninja.exe binary.

You could also just copy ninja.exe into a folder that is already on your PATH.
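Either way, a quick sanity check from a regular cmd window might help (the paths below assume the portable layout from your logs, and the Scripts folder is just a guess - use whatever folder the dir command actually prints):

rem does Windows see ninja at all?
where ninja
rem where did pip actually put the binary inside the embedded environment?
dir /s /b E:\StableDiffusion\ComfyUI_windows_portable\python_embeded\ninja.exe
rem temporarily prepend that folder for the current cmd session only
set PATH=E:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Scripts;%PATH%
where ninja

For a permanent fix you'd still want to add that folder through the System Properties dialog from the link above.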

I'll try to get access to a Windows machine asap, meanwhile let me know if any of the above helps!

MoonMoon82 commented 8 months ago

I added ninja.exe to the PATH environment variable. After the build spent a while reading lots of .h header files, I ran into this error:

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\bin\nvcc --generate-dependencies-with-compile --dependency-output inplace_abn_cuda_half.cuda.o.d -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcompiler /EHsc -Xcompiler /wd4068 -Xcompiler /wd4067 -Xcompiler /wd4624 -Xcompiler /wd4190 -Xcompiler /wd4018 -Xcompiler /wd4275 -Xcompiler /wd4267 -Xcompiler /wd4244 -Xcompiler /wd4251 -Xcompiler /wd4819 -Xcompiler /MD -DTORCH_EXTENSION_NAME=inplace_abn_v1 -DTORCH_API_INCLUDE_EXTENSION_H -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\TH -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\include" -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Include -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++17 --expt-extended-lambda -c E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\src\inplace_abn_cuda_half.cu -o inplace_abn_cuda_half.cuda.o
inplace_abn_cuda_half.cu
[4/5] C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\bin\nvcc --generate-dependencies-with-compile --dependency-output inplace_abn_cuda.cuda.o.d -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcompiler /EHsc -Xcompiler /wd4068 -Xcompiler /wd4067 -Xcompiler /wd4624 -Xcompiler /wd4190 -Xcompiler /wd4018 -Xcompiler /wd4275 -Xcompiler /wd4267 -Xcompiler /wd4244 -Xcompiler /wd4251 -Xcompiler /wd4819 -Xcompiler /MD -DTORCH_EXTENSION_NAME=inplace_abn_v1 -DTORCH_API_INCLUDE_EXTENSION_H -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\TH -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\include" -Ie:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Include -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++17 --expt-extended-lambda -c E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\src\inplace_abn_cuda.cu -o inplace_abn_cuda.cuda.o
inplace_abn_cuda.cu
[5/5] "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30037\bin\Hostx64\x64/link.exe" inplace_abn.o inplace_abn_cpu.o inplace_abn_cuda.cuda.o inplace_abn_cuda_half.cuda.o /nologo /DLL c10.lib c10_cuda.lib torch_cpu.lib torch_cuda.lib -INCLUDE:?warp_size@cuda@at@@YAHXZ torch.lib /LIBPATH:e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\lib torch_python.lib /LIBPATH:e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\libs "/LIBPATH:C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\lib\x64" cudart.lib /out:inplace_abn_v1.pyd
FAILED: inplace_abn_v1.pyd
"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.29.30037\bin\Hostx64\x64/link.exe" inplace_abn.o inplace_abn_cpu.o inplace_abn_cuda.cuda.o inplace_abn_cuda_half.cuda.o /nologo /DLL c10.lib c10_cuda.lib torch_cpu.lib torch_cuda.lib -INCLUDE:?warp_size@cuda@at@@YAHXZ torch.lib /LIBPATH:e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\lib torch_python.lib /LIBPATH:e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\libs "/LIBPATH:C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\lib\x64" cudart.lib /out:inplace_abn_v1.pyd

inplace_abn.o : fatal error LNK1000: Internal error during IMAGE::Pass1
ninja: build stopped: subcommand failed.

After reading that a Visual Studio 2019 installation could cause this issue, I uninstalled 2019 and installed 2022. Now I'm getting this error:

Traceback (most recent call last):
  File "e:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1888, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\__init__.py", line 1, in <module>
    from .HumanParserLIPCustomNode import HumanParserLIPCustomNode
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\HumanParserLIPCustomNode.py", line 5, in <module>
    from .utils import generate
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\utils.py", line 8, in <module>
    from .schp import networks
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\networks\__init__.py", line 3, in <module>
    from .AugmentCE2P import resnet101
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\networks\AugmentCE2P.py", line 21, in <module>
    from ..modules import InPlaceABNSync
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\__init__.py", line 1, in <module>
    from .bn import ABN, InPlaceABN, InPlaceABNSync
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\bn.py", line 10, in <module>
    from .functions import *
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\functions.py", line 12, in <module>
    _backend = load(name="inplace_abn",
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1306, in load
    return _jit_compile(
           ^^^^^^^^^^^^^
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1710, in _jit_compile
    _write_ninja_file_and_build_library(
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1810, in _write_ninja_file_and_build_library
    _write_ninja_file_to_build_library(
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 2238, in _write_ninja_file_to_build_library
    _write_ninja_file(
  File "e:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 2373, in _write_ninja_file
    cl_paths = subprocess.check_output(['where',
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "subprocess.py", line 466, in check_output
  File "subprocess.py", line 571, in run
subprocess.CalledProcessError: Command '['where', 'cl']' returned non-zero exit status 1.

Cannot import E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node module for custom nodes: Command '['where', 'cl']' returned non-zero exit status 1.

gabidobo commented 8 months ago

OK, so now it looks like it can't locate "cl.exe", which is the compiler/linker tool: https://learn.microsoft.com/en-us/cpp/build/reference/compiler-options?view=msvc-170

> You can start this tool only from a Visual Studio developer command prompt. You cannot start it from a system command prompt or from File Explorer. For more information, see Use the MSVC toolset from the command line.

Can you please make sure you've installed all of the things highlighted below?

[screenshot of the Visual Studio Installer with the required C++ build components highlighted]

Then, it looks like you'll need to start ComfyUI from the developer command prompt instead of the regular cmd. Here are the docs on how to launch the dev command prompt: https://learn.microsoft.com/en-us/visualstudio/ide/reference/command-prompt-powershell?view=vs-2022

You'll want to run something similar to:

cd X:\path\to\comfy
python main.py
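For the portable build from your logs, that would presumably look more like this (run inside the x64 developer command prompt, using the embedded interpreter the portable build ships with):

cd /d E:\StableDiffusion\ComfyUI_windows_portable
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build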

MoonMoon82 commented 8 months ago

I installed the components you marked, started the VS2022 command line, and ran .\python_embeded\python.exe -s ComfyUI\main.py in it.

But then a huge list of header-file includes scrolls by, and somewhere in the middle there are error messages like this:

.
.
.
[3/5] C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\bin\nvcc --generate-dependencies-with-compile --dependency-output inplace_abn_cuda.cuda.o.d -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcompiler /EHsc -Xcompiler /wd4068 -Xcompiler /wd4067 -Xcompiler /wd4624 -Xcompiler /wd4190 -Xcompiler /wd4018 -Xcompiler /wd4275 -Xcompiler /wd4267 -Xcompiler /wd4244 -Xcompiler /wd4251 -Xcompiler /wd4819 -Xcompiler /MD -DTORCH_EXTENSION_NAME=inplace_abn -DTORCH_API_INCLUDE_EXTENSION_H -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\TH -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\include" -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Include -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++17 --expt-extended-lambda -c E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\src\inplace_abn_cuda.cu -o inplace_abn_cuda.cuda.o
FAILED: inplace_abn_cuda.cuda.o
C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\bin\nvcc --generate-dependencies-with-compile --dependency-output inplace_abn_cuda.cuda.o.d -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcompiler /EHsc -Xcompiler /wd4068 -Xcompiler /wd4067 -Xcompiler /wd4624 -Xcompiler /wd4190 -Xcompiler /wd4018 -Xcompiler /wd4275 -Xcompiler /wd4267 -Xcompiler /wd4244 -Xcompiler /wd4251 -Xcompiler /wd4819 -Xcompiler /MD -DTORCH_EXTENSION_NAME=inplace_abn -DTORCH_API_INCLUDE_EXTENSION_H -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\TH -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.7\include" -IE:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Include -D_GLIBCXX_USE_CXX11_ABI=0 -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++17 --expt-extended-lambda -c E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\src\inplace_abn_cuda.cu -o inplace_abn_cuda.cuda.o
C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.36.32532\include\vcruntime.h(197): error: invalid redeclaration of type name "size_t"

C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.36.32532\include\vcruntime_new.h(27): error: enum "std::align_val_t" was previously declared with a different base type

C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.36.32532\include\vcruntime_new.h(48): error: first parameter of allocation function must be of type "size_t"
.
.
.

And then at the end it just says

.
.
.
Note: including file: E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\src\inplace_abn.h
ninja: build stopped: subcommand failed.

I don't even know how to pipe the full output into a text file so I can have a look at the whole log :(

gabidobo commented 8 months ago

Hmm could you try running in the "x64 Native Tools Command Prompt" instead of the x86 one?

Got the idea from here: https://stackoverflow.com/questions/12843846/problems-when-running-nvcc-from-command-line
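Also, to capture the whole build log so you can read it afterwards, you can redirect both stdout and stderr to a file when launching - something like this (comfy_build.log is just an example name):

.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build > comfy_build.log 2>&1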

MoonMoon82 commented 8 months ago

@gabidobo I don't have any option to choose between an x86 and x64 command line.

[screenshot, in German] (Sorry for the German language; it basically only offers to start the Developer Command Prompt or Developer PowerShell. I already tried with PowerShell but ended up with the same result.)

My command prompt looks like this - without any hint regarding x86 or x64: [screenshot]

I assume that it is already x64.

gabidobo commented 8 months ago

Could you please try typing "x64" in the start menu, see if you get something similar to this:

[screenshot of an "x64 Native Tools Command Prompt" entry in the Start menu]

MoonMoon82 commented 8 months ago

Ok, I've found the x64 console, but now I get this error message:

E:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\segformer\image_processing_segformer.py:101: FutureWarning: The `reduce_labels` parameter is deprecated and will be removed in a future version. Please use `do_reduce_labels` instead.
  warnings.warn(
Traceback (most recent call last):
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1888, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\__init__.py", line 1, in <module>
    from .HumanParserLIPCustomNode import HumanParserLIPCustomNode
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\HumanParserLIPCustomNode.py", line 5, in <module>
    from .utils import generate
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\utils.py", line 8, in <module>
    from .schp import networks
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\networks\__init__.py", line 3, in <module>
    from .AugmentCE2P import resnet101
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\networks\AugmentCE2P.py", line 21, in <module>
    from ..modules import InPlaceABNSync
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\__init__.py", line 1, in <module>
    from .bn import ABN, InPlaceABN, InPlaceABNSync
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\bn.py", line 10, in <module>
    from .functions import *
  File "E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node\schp\modules\functions.py", line 12, in <module>
    _backend = load(name="inplace_abn",
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1306, in load
    return _jit_compile(
           ^^^^^^^^^^^^^
  File "E:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 1736, in _jit_compile
    return _import_module_from_library(name, build_directory, is_python_module)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\StableDiffusion\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py", line 2132, in _import_module_from_library
    module = importlib.util.module_from_spec(spec)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ImportError: DLL load failed while importing inplace_abn: The specified module could not be found.

Cannot import E:\StableDiffusion\ComfyUI_windows_portable\ComfyUI\custom_nodes\human-parser-comfyui-node module for custom nodes: DLL load failed while importing inplace_abn: The specified module could not be found.

I found a similar issue in the OOTDiffusion repo: https://github.com/levihsu/OOTDiffusion/issues/13 - but maybe this is not exactly the issue I'm encountering.

gabidobo commented 8 months ago

OK @MoonMoon82, it looks like you've hit the error mentioned in #2 - the inplace_abn extension compiled properly, but it's not loading for some reason.
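If you want to dig into why it won't load, one option (from the same x64 developer prompt) is to point dumpbin at the compiled extension - the path below is only a placeholder for wherever torch put the build output:

dumpbin /DEPENDENTS C:\path\to\torch_extensions\inplace_abn\inplace_abn.pyd

That lists the DLLs the module expects at load time; anything in that list that Windows can't find on the PATH (typically CUDA or torch DLLs) would explain a "specified module could not be found" error.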

@kitkat4947 your error seems to be related to the "OOTDiffusion" node, maybe try posting the issue over at https://github.com/AuroBit/ComfyUI-OOTDiffusion

Going to close this as I think we got to the bottom of the "ninja not found" issue, and will continue debugging other issues under their respective threads.

gabidobo commented 8 months ago

@kitkat4947 in your case it looks like "ninja.exe" cannot be found by the OS; you need to add the directory that contains it to your system PATH. Check above in this thread for details.

offmybach commented 8 months ago

Still the same errors here. I'm attempting to run this in Stability Matrix and can't easily alter the system PATH.

offmybach commented 8 months ago

> Going to close this as I think we got to the bottom of the "ninja not found" issue, and will continue debugging other issues under their respective threads.

what is "the bottom" of the issue? I don't have OOTdiffusion installed and getting same errors

gabidobo commented 8 months ago

@offmybach I'm not sure how Stability Matrix works (or whether it matters for this issue), but the ninja.exe file needs to be visible to the OS for this node to work - which is why it needs to be on the PATH.

If you can't update your environment variables, you might not be able to run this. But you could try the following:
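One possible workaround (a rough sketch - the paths are placeholders for wherever Stability Matrix keeps your ComfyUI and wherever ninja.exe actually lives) is a small launcher .bat that prepends that folder to PATH for the ComfyUI process only:

@echo off
rem make ninja.exe visible to this process without touching the system PATH
set PATH=E:\path\to\folder\containing\ninja;%PATH%
rem then launch ComfyUI as usual (adjust to your install)
cd /d E:\path\to\ComfyUI_windows_portable
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build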

gabidobo commented 7 months ago

@kitkat4947 Copy the following models to the models/schp directory, depending on which parser you would like to use:

gabidobo commented 7 months ago

@kitkat4947 again, it looks like you're in the wrong repo :) Your issue is with the OOTDiffusion node (https://github.com/AuroBit/ComfyUI-OOTDiffusion), so you might want to check the issues there. But generally, you want to target the "models" folder of your Comfy installation. PS: It looks like you're also missing the CLIP vision model.