Fannovel16 / comfyui_controlnet_aux

ComfyUI's ControlNet Auxiliary Preprocessors
Apache License 2.0

DWPose needs only 'onnxruntime-gpu' (without 'onnxruntime') to run on GPU. #242

Open tomudo opened 7 months ago

tomudo commented 7 months ago

I tested reinstalling 'onnxruntime' / 'onnxruntime-gpu' several times to get the DWPose preprocessor to run on GPU.

You should uninstall both 'onnxruntime' and 'onnxruntime-gpu', then install only 'onnxruntime-gpu', to successfully run the DWPose preprocessor node on GPU.

One caveat: some nodes, such as rembg, need 'onnxruntime' to run properly.
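A minimal sketch of that reinstall sequence (assuming `pip` points at the ComfyUI Python environment; the Windows portable build may need `python_embeded\python.exe -m pip` instead):

```shell
# Remove both builds so only one package owns the onnxruntime module name
pip uninstall -y onnxruntime onnxruntime-gpu

# Install only the GPU build for DWPose
pip install onnxruntime-gpu
```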

Layer-norm commented 7 months ago

Actually, rembg supports GPU; you can find that in the rembg and rembg-comfyui-node READMEs. So uninstall rembg first, then install rembg[gpu]. That should work.

frankchieng commented 7 months ago

> Actually, rembg supports GPU; you can find that in the rembg and rembg-comfyui-node READMEs. So uninstall rembg first, then install rembg[gpu]. That should work.

Actually, pip install rembg[gpu] installs onnxruntime by default, so there is a conflict between the DWPose and rembg installation prerequisites.

Layer-norm commented 7 months ago

> Actually, pip install rembg[gpu] installs onnxruntime by default, so there is a conflict between the DWPose and rembg installation prerequisites.

@frankchieng OK, but I don't see any conflict on my computer with onnxruntime==1.16.3 and onnxruntime-gpu==1.16.3. Besides, my PyTorch is 2.2.0 and my CUDA is 12.1 (see the attached screenshot). Maybe you can try these versions.

And here is my test workflow: rembg

Layer-norm commented 7 months ago

OK, I think I know what happened. There seem to be some conflicts between the latest onnxruntime and onnxruntime-gpu (version 1.17.0).

If you have already installed both of them, here is the solution: first uninstall both onnxruntime-gpu and onnxruntime, then install onnxruntime only, and after that install onnxruntime-gpu.
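The order-sensitive sequence above might look like this (a sketch; the ==1.16.3 pins are taken from the versions reported working earlier in this thread, not a confirmed requirement):

```shell
# Clean slate: remove both packages first
pip uninstall -y onnxruntime onnxruntime-gpu

# Install the CPU package first, then the GPU package on top,
# pinning 1.16.3 since 1.17.0 reportedly conflicts
pip install "onnxruntime==1.16.3"
pip install "onnxruntime-gpu==1.16.3"
```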

You can test your onnx setup with the code below in the ComfyUI Python environment:

```python
import onnxruntime
onnxruntime.get_available_providers()
```

If the output is ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] or something like that, congratulations, it works.

Otherwise, e.g. ['AzureExecutionProvider', 'CPUExecutionProvider'], it failed and your onnxruntime-gpu won't work.
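That pass/fail check can be wrapped in a small helper; a sketch (`gpu_onnx_available` is a name invented here, and the import is guarded so the snippet also runs where onnxruntime is not installed):

```python
def gpu_onnx_available(providers):
    """True when the CUDA execution provider is registered."""
    return "CUDAExecutionProvider" in providers

try:
    import onnxruntime
    available = onnxruntime.get_available_providers()
except ImportError:
    available = []  # onnxruntime not installed in this environment

print("GPU build active:", gpu_onnx_available(available))
```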

frankchieng commented 7 months ago

> OK, I think I know what happened. There seem to be some conflicts between the latest onnxruntime and onnxruntime-gpu (version 1.17.0).
>
> If you have already installed both of them, here is the solution: first uninstall both onnxruntime-gpu and onnxruntime, then install onnxruntime only, and after that install onnxruntime-gpu.
>
> You can test your onnx setup with the code below in the ComfyUI Python environment:
>
> ```python
> import onnxruntime
> onnxruntime.get_available_providers()
> ```
>
> If the output is ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] or something like that, congratulations, it works.
>
> Otherwise, e.g. ['AzureExecutionProvider', 'CPUExecutionProvider'], it failed and your onnxruntime-gpu won't work.

I can see the CUDAExecutionProvider, but the DWPose node won't load. It's weird: when I pip uninstall both onnxruntime and onnxruntime-gpu, the DWPose node appears again.