ltdrdata / ComfyUI-Impact-Pack

Custom nodes pack for ComfyUI. This custom node pack helps to conveniently enhance images through Detector, Detailer, Upscaler, Pipe, and more.
GNU General Public License v3.0

DetailerForEachDebugPipe issue - SEGS throwing errors #286

Closed · alenknight closed this issue 11 months ago

alenknight commented 11 months ago

When running DetailerForEachDebugPipe, I'm getting an error:

Error occurred when executing DetailerForEachDebugPipe:

'NoneType' object has no attribute 'shape'

File "C:\AI\ComfyUI\ComfyUI\execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\AI\ComfyUI\ComfyUI\execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\AI\ComfyUI\ComfyUI\execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 1155, in doit
DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps, cfg,
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 206, in do_detail
enhanced_pil, cnet_pil = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 269, in enhance_detail
refined_latent = ksampler_wrapper(model, seed, steps, cfg, sampler_name, scheduler, positive, negative,
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 72, in ksampler_wrapper
nodes.KSampler().sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "C:\AI\ComfyUI\ComfyUI\nodes.py", line 1237, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "C:\AI\ComfyUI\ComfyUI\nodes.py", line 1207, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 126, in animatediff_sample
return orig_comfy_sample(model, noise, *args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 129, in KSampler_sample
return _KSampler_sample(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 692, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler(), sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 138, in sample
return _sample(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 598, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 558, in sample
samples = getattr(k_diffusion_sampling, "sample_{}".format(sampler_name))(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **extra_options)
File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\k_diffusion\sampling.py", line 137, in sample_euler
denoised = model(x, sigma_hat * s_in, **extra_args)
File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 275, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 265, in forward
return self.apply_model(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 841, in apply_model
out = super().apply_model(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 262, in apply_model
out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 250, in sampling_function
cond, uncond = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 205, in calc_cond_uncond_batch
c['control'] = control.get_control(input_x, timestep_, c, len(cond_or_uncond))
File "C:\AI\ComfyUI\ComfyUI\comfy\controlnet.py", line 166, in get_control
control = self.control_model(x=x_noisy.to(self.control_model.dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(self.control_model.dtype), y=y)
File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "C:\AI\ComfyUI\ComfyUI\comfy\cldm\cldm.py", line 294, in forward
assert y.shape[0] == x.shape[0]
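
For reference, the error comes from the very last frame of the trace: `comfy/cldm/cldm.py` asserts that the ControlNet's ADM conditioning tensor `y` has the same batch size as the latent `x`, but `y` is `None` in this run, so even reading `.shape` raises. A minimal, self-contained sketch of that failure mode (the function name and tensor sizes here are illustrative, not the real ComfyUI code):

```python
import torch

def controlnet_forward(x, hint, timesteps, context, y=None):
    # The check from the last traceback frame: the ADM conditioning must match
    # the latent batch. With y=None, .shape itself is what blows up.
    assert y.shape[0] == x.shape[0]
    return x  # a real ControlNet would run its blocks here

x = torch.randn(1, 4, 150, 200)  # stand-in for the cropped segment's noisy latent
try:
    controlnet_forward(x, hint=None, timesteps=None, context=None, y=None)
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'shape'
```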

[Screenshot of the ComfyUI interface with the error dialog open over the workflow. Positive prompt: "perfect hand"; negative prompt: "text, watermark, embedding:negative_hand-neg, deformed, cartoon, painting, (lineart:1.2)"; the wildcard spec fields are left empty ("if kept empty, this option will be ignored").]

Here's the full terminal output:


C:\AI\ComfyUI>.\python_embeded\python.exe -s ComfyUI\main.py --listen --windows-standalone-build
** ComfyUI start up time: 2023-11-16 13:26:08.013986

Prestartup times for custom nodes:
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager

Total VRAM 24576 MB, total RAM 32703 MB
xformers version: 0.0.21
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
VAE dtype: torch.bfloat16
Using xformers cross attention
Adding extra search path checkpoints C:\AI\stable-diffusion-webui\models/Stable-diffusion
Adding extra search path configs C:\AI\stable-diffusion-webui\models/Stable-diffusion
Adding extra search path vae C:\AI\stable-diffusion-webui\models/VAE
Adding extra search path loras C:\AI\stable-diffusion-webui\models/Lora
Adding extra search path loras C:\AI\stable-diffusion-webui\models/LyCORIS
Adding extra search path upscale_models C:\AI\stable-diffusion-webui\models/ESRGAN
Adding extra search path upscale_models C:\AI\stable-diffusion-webui\models/RealESRGAN
Adding extra search path upscale_models C:\AI\stable-diffusion-webui\models/SwinIR
Adding extra search path embeddings C:\AI\stable-diffusion-webui\embeddings
Adding extra search path hypernetworks C:\AI\stable-diffusion-webui\models/hypernetworks
Adding extra search path controlnet C:\AI\stable-diffusion-webui\models/ControlNet
Error:
[WinError 1314] A required privilege is not held by the client: 'C:\\AI\\ComfyUI\\ComfyUI\\custom_nodes\\ComfyLiterals\\js' -> 'C:\\AI\\ComfyUI\\ComfyUI\\web\\extensions\\ComfyLiterals'
Failed to create symlink to C:\AI\ComfyUI\ComfyUI\web\extensions\ComfyLiterals. Please copy the folder manually.
Source: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyLiterals\js
Target: C:\AI\ComfyUI\ComfyUI\web\extensions\ComfyLiterals
### Loading: ComfyUI-Impact-Pack (V4.30.8)
### Loading: ComfyUI-Impact-Pack (Subpack: V0.3)
### Loading: ComfyUI-Inspire-Pack (V0.44)
### Loading: ComfyUI-Manager (V1.0.1)
### ComfyUI Revision: 1682 [8509bd58] | Released on '2023-11-13'
moviepy is already installed.
cv2 is already installed.
git is already installed.
zipfile is already installed.
skbuild is already installed.

[SD Prompt Reader] Node version: 1.0.1
[SD Prompt Reader] Core version: 1.3.4b2
Failed to auto update `Quality of Life Suit`
QualityOfLifeSuit_Omar92_DIR: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-QualityOfLifeSuit_Omar92
Total VRAM 24576 MB, total RAM 32703 MB
xformers version: 0.0.21
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
VAE dtype: torch.bfloat16
Torch version: 2.0.1+cu118
Comfyroll Custom Nodes: Loaded
Davemane42 Custom Nodes: Loaded
FizzleDorf Custom Nodes: Loaded
Using xformers cross attention
[tinyterraNodes] Loaded
Efficiency Nodes: Attempting to add Control Net options to the 'HiRes-Fix Script' Node (comfyui_controlnet_aux add-on)...Success!
Efficiency Nodes: Attempting to add 'AnimatedDiff Script' Node (ComfyUI-AnimateDiff-Evolved add-on)...Success!
Total VRAM 24576 MB, total RAM 32703 MB
xformers version: 0.0.21
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
VAE dtype: torch.bfloat16
C:\AI\ComfyUI\ComfyUI\custom_nodes\failfast-comfyui-extensions\extensions
C:\AI\ComfyUI\ComfyUI\web\extensions\failfast-comfyui-extensions
[Power Noise Suite]: 🦚🦚🦚 Cock-a-doodle-dooo! 🦚🦚🦚
[Power Noise Suite]: Tamed 11 wild nodes.

[rgthree] Loaded 15 exciting nodes.
[rgthree] Optimizing ComfyUI recursive execution. If queueing and/or re-queueing seems broken, change "patch_recursive_execution" to false in rgthree_config.json

Searge-SDXL v4.3.1 in C:\AI\ComfyUI\ComfyUI\custom_nodes\SeargeSDXL
WAS Node Suite: BlenderNeko's Advanced CLIP Text Encode found, attempting to enable `CLIPTextEncode` support.
WAS Node Suite: `CLIPTextEncode (BlenderNeko Advanced + NSP)` node enabled under `WAS Suite/Conditioning` menu.
WAS Node Suite: OpenCV Python FFMPEG support is enabled
WAS Node Suite Warning: `ffmpeg_bin_path` is not set in `C:\AI\ComfyUI\ComfyUI\custom_nodes\was-node-suite-comfyui\was_suite_config.json` config file. Will attempt to use system ffmpeg binaries if available.
WAS Node Suite: Finished. Loaded 198 nodes successfully.

        "Success is not just about making money. It's about making a difference." - Unknown

Import times for custom nodes:
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\invert_image_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl_utility.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\invert_mask_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\imageflip_ally.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\monocromatic_clip_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\mosaic_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\brightness_contrast_ally_modified.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\brightness_contrast_ally.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\histogram_equalization.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\crop_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\vae_decode_preview.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\sharpness_ally.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\image_to_mask_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\gaussian_blur_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\saturation_ally.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\image_to_contrast_mask_node.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\Pseudo_HDR_ally.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\gaussian_blur_ally.py
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-SDXL-EmptyLatentImage
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl-recommended-res-calc
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\sdxl_prompt_styler
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ADV_CLIP_emb
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_JPS-Nodes
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_restart_sampling
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_IPAdapter_plus
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\stability-ComfyUI-nodes
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_NestedNodeBuilder
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\cg_custom_core
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyLiterals
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_NoxinNodes
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\PowerNoiseSuite
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Dave_CustomNode
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-QualityOfLifeSuit_Omar92
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyMath
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\cg-image-picker
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-VideoHelperSuite
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\failfast-comfyui-extensions
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\efficiency-nodes-comfyui
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\Derfuu_ComfyUI_ModdedNodes
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-prompt-reader-node
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
   0.0 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\facerestore_cf
   0.1 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack
   0.1 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes
   0.2 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\NodeGPT
   0.2 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
   0.2 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-reactor-node
   0.2 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_FizzNodes
   0.3 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui_segment_anything
   0.3 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\comfyui-dynamicprompts
   0.3 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
   0.3 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\clipseg.py
   0.4 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-N-Nodes
   0.6 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_tinyterraNodes
   0.8 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\SeargeSDXL
   1.4 seconds: C:\AI\ComfyUI\ComfyUI\custom_nodes\was-node-suite-comfyui

Starting server

To see the GUI go to: http://0.0.0.0:8188
FETCH DATA from: C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json
QualityOfLifeSuit_Omar92::NSP ready
got prompt

0: 480x640 1 hand, 127.5ms
Speed: 4.0ms preprocess, 127.5ms inference, 61.0ms postprocess per image at shape (1, 3, 480, 640)
[] []
model_type EPS
adm 0
Using xformers attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using xformers attention in VAE
loaded straight to GPU
Requested to load BaseModel
Loading 1 new model
Requested to load SD1ClipModel
Loading 1 new model
Detailer: force inpaint
Detailer: segment upscale for ((1402.7303, 1093.1604)) | crop region (1600, 1200) x 1.0 -> (1600, 1200)

DWPose: Using yolox_l.onnx for bbox detection and dw-ll_ucoco_384.onnx for pose estimation
DWPose: Bbox 1899.41ms
DWPose: Pose 400.17ms
Requested to load ControlNet
Loading 1 new model
  0%|                                                                                           | 0/20 [00:00<?, ?it/s]
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "C:\AI\ComfyUI\ComfyUI\execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "C:\AI\ComfyUI\ComfyUI\execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "C:\AI\ComfyUI\ComfyUI\execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 1155, in doit
    DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps, cfg,
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 206, in do_detail
    enhanced_pil, cnet_pil = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 269, in enhance_detail
    refined_latent = ksampler_wrapper(model, seed, steps, cfg, sampler_name, scheduler, positive, negative,
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 72, in ksampler_wrapper
    nodes.KSampler().sample(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "C:\AI\ComfyUI\ComfyUI\nodes.py", line 1237, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
  File "C:\AI\ComfyUI\ComfyUI\nodes.py", line 1207, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 126, in animatediff_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\sample.py", line 100, in sample
    samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 129, in KSampler_sample
    return _KSampler_sample(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 692, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler(), sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 138, in sample
    return _sample(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 598, in sample
    samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 558, in sample
    samples = getattr(k_diffusion_sampling, "sample_{}".format(sampler_name))(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **extra_options)
  File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\k_diffusion\sampling.py", line 137, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 275, in forward
    out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
  File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 265, in forward
    return self.apply_model(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 841, in apply_model
    out = super().apply_model(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 262, in apply_model
    out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 250, in sampling_function
    cond, uncond = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
  File "C:\AI\ComfyUI\ComfyUI\comfy\samplers.py", line 205, in calc_cond_uncond_batch
    c['control'] = control.get_control(input_x, timestep_, c, len(cond_or_uncond))
  File "C:\AI\ComfyUI\ComfyUI\comfy\controlnet.py", line 166, in get_control
    control = self.control_model(x=x_noisy.to(self.control_model.dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(self.control_model.dtype), y=y)
  File "C:\AI\ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\AI\ComfyUI\ComfyUI\comfy\cldm\cldm.py", line 294, in forward
    assert y.shape[0] == x.shape[0]
AttributeError: 'NoneType' object has no attribute 'shape'

Prompt executed in 7.80 seconds

And here's a sample workflow: https://drive.google.com/file/d/1aUJTVNaAJO89yCJRYx3F6NdA5Noj-dkh/view?usp=drive_link

ltdrdata commented 11 months ago

I still cannot access that link.

alenknight commented 11 months ago

Sorry about that... I had it accidentally set to restricted. It's shared publicly now: https://drive.google.com/file/d/1aUJTVNaAJO89yCJRYx3F6NdA5Noj-dkh/view?usp=share_link

ltdrdata commented 11 months ago

I tested ComfyUI and all custom nodes in their latest versions, and there don't seem to be any issues. Have you also updated the ControlNet Aux extension to the latest version?

alenknight commented 11 months ago

Hmmm... I tested it on a new install now... same issue. However, I also noticed I didn't have onnxruntime-gpu installed, so I did that now. The error still occurs, but it comes back more quickly. I tested DWPose and it's working too (I saw many folks posting the same error about DWPose and onnxruntime recently).

I also posted the error at https://github.com/Fannovel16/comfyui_controlnet_aux/issues/121, as it does seem to be an issue with how this node interacts with controlnet_aux, though even a new install has the same issue.

ltdrdata commented 11 months ago

> Hmmm... I tested it on a new install now... same issue. However, I also noticed I didn't have onnxruntime-gpu installed, so I did that now. The error still occurs, but it comes back more quickly. I tested DWPose and it's working too (I saw many folks posting the same error about DWPose and onnxruntime recently).
>
> I also posted the error at Fannovel16/comfyui_controlnet_aux#121, as it does seem to be an issue with how this node interacts with controlnet_aux, though even a new install has the same issue.

Try changing the ControlNet model and the other models once. The only thing I changed was the models in the workflow.

alenknight commented 11 months ago

Oh, I tried changing to the face model and a few others. Same issue with DetailerForEachDebugPipe every time.

Does it run just fine for you? Do you have a workflow I could test? Maybe I'm doing something wrong. Or does my workflow work fine for you?

(I could also try it on another machine later.)

ltdrdata commented 11 months ago

> Oh, I tried changing to the face model and a few others. Same issue with DetailerForEachDebugPipe every time.
>
> Does it run just fine for you? Do you have a workflow I could test? Maybe I'm doing something wrong. Or does my workflow work fine for you?
>
> (I could also try it on another machine later.)

I replaced only the image and the models in your workflow. Since it's a workflow that involves hands, I chose an image that includes hands.

I see ComfyUI_smZNodes in the log. Would you like to disable it and test?

alenknight commented 11 months ago

Ahhh, you solved it! :) Actually, when you said "change models", I went and looked again: I was running an SDXL ControlNet with SD1.5 checkpoint models. Sorry for the confusion.

Obviously I should have checked that before, but for some reason it slipped by me. Thanks again! Works great!
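
For anyone landing here later: the ControlNet and the checkpoint have to come from the same model family. An SDXL ControlNet expects the pooled ADM conditioning `y` that an SD1.5 pipeline never produces, which is consistent with `y` being `None` at the assertion in the traces above. A hedged way to check which family a ControlNet file belongs to is to inspect its weight keys; the sketch below assumes a `.safetensors` file in the usual (non-diffusers) layout, and the path is a placeholder:

```python
from safetensors import safe_open

def controlnet_family(path: str) -> str:
    # Only reads the safetensors header, so this is cheap even for large files.
    with safe_open(path, framework="pt", device="cpu") as f:
        keys = list(f.keys())
    # SDXL ControlNets carry label_emb.* weights for the ADM/`y` conditioning;
    # SD1.5 ControlNets have no such embedding.
    return "SDXL" if any("label_emb." in k for k in keys) else "SD1.5"

# Placeholder path: point this at the ControlNet actually selected in the workflow
# and compare the result against the base checkpoint's family.
print(controlnet_family("models/controlnet/my_controlnet.safetensors"))
```

Keeping the checkpoint and ControlNet in the same family (both SD1.5 or both SDXL) avoids this class of error.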