comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/

Problems using the workflow for repairing hands: 'NoneType' object has no attribute 'copy' #5323

Open areinius opened 3 weeks ago

areinius commented 3 weeks ago

Your question

I ran into problems while using a workflow for repairing hands. Here is the link to the workflow: https://civitai.com/models/471894/hand-fixer-or-comfyui-workflow

[Screenshot 2024-10-22 192446]

Logs

# ComfyUI Error Report
## Error Details
- **Node Type:** DetailerForEach
- **Exception Type:** AttributeError
- **Exception Message:** 'NoneType' object has no attribute 'copy'
## Stack Trace

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 378, in doit
    DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps,

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 323, in do_detail
    enhanced_image, cnet_pils = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 313, in enhance_detail
    positive, negative, cnet_pils = control_net_wrapper.apply(positive, negative, upscaled_image, noise_mask)

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 1888, in apply
    positive, negative = nodes.ControlNetApplyAdvanced().apply_controlnet(positive, negative, self.control_net, cnet_image, self.strength, self.start_percent, self.end_percent)

  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\nodes.py", line 848, in apply_controlnet
    c_net = control_net.copy().set_cond_hint(control_hint, strength, (start_percent, end_percent), vae=vae, extra_concat=extra_concat)

System Information

Logs

2024-10-22 19:26:22,899 - root - INFO - Total VRAM 8188 MB, total RAM 16116 MB
2024-10-22 19:26:22,900 - root - INFO - pytorch version: 2.1.2+cu118
2024-10-22 19:26:23,249 - root - INFO - xformers version: 0.0.23.post1+cu118
2024-10-22 19:26:23,249 - root - INFO - Set vram state to: NORMAL_VRAM
2024-10-22 19:26:23,249 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 4060 Laptop GPU : cudaMallocAsync
2024-10-22 19:26:24,176 - root - INFO - Using xformers cross attention
2024-10-22 19:26:25,249 - root - INFO - [Prompt Server] web root: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\web
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path checkpoints D:\sd-webui-aki-v4.5\webui\models/Stable-diffusion
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path configs D:\sd-webui-aki-v4.5\webui\models/Stable-diffusion
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path vae D:\sd-webui-aki-v4.5\webui\models/VAE
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path loras D:\sd-webui-aki-v4.5\webui\models/Lora
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path loras D:\sd-webui-aki-v4.5\webui\models/LyCORIS
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path upscale_models D:\sd-webui-aki-v4.5\webui\models/ESRGAN
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path upscale_models D:\sd-webui-aki-v4.5\webui\models/RealESRGAN
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path upscale_models D:\sd-webui-aki-v4.5\webui\models/SwinIR
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path embeddings D:\sd-webui-aki-v4.5\webui\embeddings
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path hypernetworks D:\sd-webui-aki-v4.5\webui\models/hypernetworks
2024-10-22 19:26:25,251 - root - INFO - Adding extra search path controlnet D:\sd-webui-aki-v4.5\webui\models/ControlNet
2024-10-22 19:26:44,270 - root - WARNING - Traceback (most recent call last):
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\nodes.py", line 2001, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\efficiency-nodes-comfyui\__init__.py", line 9, in <module>
    from  .efficiency_nodes import NODE_CLASS_MAPPINGS
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\efficiency-nodes-comfyui\efficiency_nodes.py", line 46, in <module>
    from .py import smZ_cfg_denoiser
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\efficiency-nodes-comfyui\py\smZ_cfg_denoiser.py", line 7, in <module>
    from comfy.samplers import KSampler, KSamplerX0Inpaint, wrap_model
ImportError: cannot import name 'wrap_model' from 'comfy.samplers' (D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py)

2024-10-22 19:26:44,271 - root - WARNING - Cannot import D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\efficiency-nodes-comfyui module for custom nodes: cannot import name 'wrap_model' from 'comfy.samplers' (D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py)
2024-10-22 19:26:44,287 - root - INFO - 
Import times for custom nodes:
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\websocket_image_save.py
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\AIGODLIKE-ComfyUI-Translation
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ControlNet-LLLite-ComfyUI
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\FreeU_Advanced
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\stability-ComfyUI-nodes
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_TiledKSampler
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-WD14-Tagger
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_IPAdapter_plus
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_experiments
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\PowerNoiseSuite
2024-10-22 19:26:44,287 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\images-grid-comfy-plugin
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_UltimateSDUpscale
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Custom-Scripts
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\comfyui-workspace-manager
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Advanced-ControlNet
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\Derfuu_ComfyUI_ModdedNodes
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-AnimateDiff-Evolved
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2024-10-22 19:26:44,288 - root - INFO -    0.0 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Inspire-Pack
2024-10-22 19:26:44,288 - root - INFO -    0.1 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\comfyui_controlnet_aux
2024-10-22 19:26:44,288 - root - INFO -    0.1 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Marigold
2024-10-22 19:26:44,288 - root - INFO -    0.2 seconds (IMPORT FAILED): D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\efficiency-nodes-comfyui
2024-10-22 19:26:44,288 - root - INFO -    0.4 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_FizzNodes
2024-10-22 19:26:44,288 - root - INFO -    0.4 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Manager
2024-10-22 19:26:44,288 - root - INFO -    1.3 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack
2024-10-22 19:26:44,288 - root - INFO -    5.1 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Crystools
2024-10-22 19:26:44,288 - root - INFO -   10.3 seconds: D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Custom_Nodes_AlekPet
2024-10-22 19:26:44,288 - root - INFO - 
2024-10-22 19:26:44,298 - root - INFO - Starting server

2024-10-22 19:26:44,298 - root - INFO - To see the GUI go to: http://127.0.0.1:8188
2024-10-22 19:27:05,103 - root - INFO - got prompt
2024-10-22 19:27:05,608 - root - INFO - model weight dtype torch.float16, manual cast: None
2024-10-22 19:27:05,610 - root - INFO - model_type EPS
2024-10-22 19:27:11,915 - root - INFO - Using xformers attention in VAE
2024-10-22 19:27:11,918 - root - INFO - Using xformers attention in VAE
2024-10-22 19:27:12,437 - root - INFO - Requested to load SDXLClipModel
2024-10-22 19:27:12,437 - root - INFO - Loading 1 new model
2024-10-22 19:27:12,445 - root - INFO - loaded completely 0.0 1560.802734375 True
2024-10-22 19:27:14,685 - root - WARNING - WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 != 1280
2024-10-22 19:27:14,685 - root - WARNING - WARNING: shape mismatch when trying to apply embedding, embedding will be ignored 768 != 1280
2024-10-22 19:27:18,413 - custom_mesh_graphormer.modeling.hrnet.hrnet_cls_net_gridfeat - INFO - => init weights from normal distribution
2024-10-22 19:27:19,226 - custom_mesh_graphormer.modeling.hrnet.hrnet_cls_net_gridfeat - INFO - => loading pretrained model D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\comfyui_controlnet_aux\ckpts\hr16/ControlNet-HandRefiner-pruned\hrnetv2_w64_imagenet_pretrained.pth
2024-10-22 19:27:21,831 - root - INFO - Requested to load AutoencoderKL
2024-10-22 19:27:21,831 - root - INFO - Loading 1 new model
2024-10-22 19:27:21,864 - root - INFO - loaded completely 0.0 159.55708122253418 True
2024-10-22 19:27:22,346 - root - INFO - Requested to load SDXL
2024-10-22 19:27:22,346 - root - INFO - Requested to load ControlNet
2024-10-22 19:27:22,346 - root - INFO - Loading 2 new models
2024-10-22 19:27:24,425 - root - INFO - loaded partially 4734.340386199951 4734.339599609375 0
2024-10-22 19:27:24,439 - root - INFO - loaded partially 64.0 63.99957275390625 0
2024-10-22 19:27:24,702 - root - ERROR - !!! Exception during processing !!! mat1 and mat2 shapes cannot be multiplied (154x2048 and 768x320)
2024-10-22 19:27:24,710 - root - ERROR - Traceback (most recent call last):
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 378, in doit
    DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 323, in do_detail
    enhanced_image, cnet_pils = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 348, in enhance_detail
    refined_latent = impact_sampling.ksampler_wrapper(model2, seed2, steps2, cfg2, sampler_name2, scheduler2, positive2, negative2,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 241, in ksampler_wrapper
    refined_latent = separated_sample(model, True, seed, advanced_steps, cfg, sampler_name, scheduler,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 214, in separated_sample
    res = sample_with_custom_noise(model, add_noise, seed, cfg, positive, negative, impact_sampler, sigmas, latent_image, noise=noise, callback=callback)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 158, in sample_with_custom_noise
    samples = comfy.sample.sample_custom(model, noise, cfg, sampler, sigmas, positive, negative, latent_image,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 248, in motion_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\sample.py", line 48, in sample_custom
    samples = comfy.samplers.sample(model, noise, positive, negative, cfg, model.load_device, sampler, sigmas, model_options=model.model_options, latent_image=latent_image, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 729, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 716, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 695, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 600, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\k_diffusion\sampling.py", line 664, in sample_dpmpp_2m
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 682, in __call__
    return self.predict_noise(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 685, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 202, in calc_cond_batch
    c['control'] = control.get_control(input_x, timestep_, c, len(cond_or_uncond))
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\controlnet.py", line 258, in get_control
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.to(dtype), context=context.to(dtype), **extra)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\cldm\cldm.py", line 430, in forward
    h = module(h, emb, context)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 60, in forward
    return forward_timestep_embed(self, *args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 44, in forward_timestep_embed
    x = layer(x, context, transformer_options)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ldm\modules\attention.py", line 694, in forward
    x = block(x, context=context[i], transformer_options=transformer_options)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ldm\modules\attention.py", line 621, in forward
    n = self.attn2(n, context=context_attn2, value=value_attn2)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ldm\modules\attention.py", line 467, in forward
    k = self.to_k(context)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ops.py", line 76, in forward
    return self.forward_comfy_cast_weights(*args, **kwargs)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\ops.py", line 72, in forward_comfy_cast_weights
    return torch.nn.functional.linear(input, weight, bias)
RuntimeError: mat1 and mat2 shapes cannot be multiplied (154x2048 and 768x320)

2024-10-22 19:27:24,711 - root - INFO - Prompt executed in 19.60 seconds
2024-10-22 19:27:30,017 - root - INFO - got prompt
2024-10-22 19:27:30,169 - root - ERROR - error could not detect control model type.
2024-10-22 19:27:30,170 - root - ERROR - error checkpoint does not contain controlnet or t2i adapter data D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\models\controlnet\controlnetxlCNXL_bdsqlszDepth.safetensors
2024-10-22 19:27:30,362 - root - ERROR - !!! Exception during processing !!! 'NoneType' object has no attribute 'copy'
2024-10-22 19:27:30,362 - root - ERROR - Traceback (most recent call last):
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 378, in doit
    DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 323, in do_detail
    enhanced_image, cnet_pils = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 313, in enhance_detail
    positive, negative, cnet_pils = control_net_wrapper.apply(positive, negative, upscaled_image, noise_mask)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 1888, in apply
    positive, negative = nodes.ControlNetApplyAdvanced().apply_controlnet(positive, negative, self.control_net, cnet_image, self.strength, self.start_percent, self.end_percent)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\nodes.py", line 848, in apply_controlnet
    c_net = control_net.copy().set_cond_hint(control_hint, strength, (start_percent, end_percent), vae=vae, extra_concat=extra_concat)
AttributeError: 'NoneType' object has no attribute 'copy'

2024-10-22 19:27:30,363 - root - INFO - Prompt executed in 0.34 seconds
2024-10-22 19:27:37,622 - root - INFO - got prompt
2024-10-22 19:27:37,695 - root - ERROR - !!! Exception during processing !!! 'NoneType' object has no attribute 'copy'
2024-10-22 19:27:37,695 - root - ERROR - Traceback (most recent call last):
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 378, in doit
    DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 323, in do_detail
    enhanced_image, cnet_pils = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 313, in enhance_detail
    positive, negative, cnet_pils = control_net_wrapper.apply(positive, negative, upscaled_image, noise_mask)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 1888, in apply
    positive, negative = nodes.ControlNetApplyAdvanced().apply_controlnet(positive, negative, self.control_net, cnet_image, self.strength, self.start_percent, self.end_percent)
  File "D:\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\nodes.py", line 848, in apply_controlnet
    c_net = control_net.copy().set_cond_hint(control_hint, strength, (start_percent, end_percent), vae=vae, extra_concat=extra_concat)
AttributeError: 'NoneType' object has no attribute 'copy'

2024-10-22 19:27:37,696 - root - INFO - Prompt executed in 0.07 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":125,"last_link_id":208,"nodes":[{"id":102,"type":"Reroute","pos":{"0":670,"1":660},"size":[82,26],"flags":{},"order":10,"mode":0,"inputs":[{"name":"","type":"*","link":198,"label":""}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[176],"slot_index":0,"label":"IMAGE"}],"properties":{"showOutputText":true,"horizontal":false}},{"id":12,"type":"UltralyticsDetectorProvider","pos":{"0":400,"1":210},"size":{"0":230,"1":80},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"BBOX_DETECTOR","type":"BBOX_DETECTOR","links":[134,182],"slot_index":0,"shape":3,"label":"BBOX_DETECTOR"},{"name":"SEGM_DETECTOR","type":"SEGM_DETECTOR","links":[],"slot_index":1,"shape":3,"label":"SEGM_DETECTOR"}],"properties":{"Node name for S&R":"UltralyticsDetectorProvider"},"widgets_values":["bbox/hand_yolov8s.pt"]},{"id":101,"type":"SaveImage","pos":{"0":1550,"1":660},"size":{"0":340,"1":290},"flags":{},"order":18,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":174,"label":"images"}],"outputs":[],"properties":{},"widgets_values":["Detailer/HandsFixed"]},{"id":104,"type":"Note","pos":{"0":950,"1":1220},"size":{"0":250,"1":60},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["If original hand is really distorted, set \"cycle = 2\"."],"color":"#432","bgcolor":"#653"},{"id":124,"type":"Reroute","pos":{"0":840,"1":720},"size":[75,26],"flags":{},"order":11,"mode":0,"inputs":[{"name":"","type":"*","link":203,"label":""}],"outputs":[{"name":"CLIP","type":"CLIP","links":[204,205,206],"slot_index":0,"label":"CLIP"}],"properties":{"showOutputText":true,"horizontal":false}},{"id":105,"type":"Note","pos":{"0":710,"1":800},"size":{"0":210,"1":80},"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["When needed, mention gestures, colored fingernails, claws or background in your prompt."],"color":"#432","bgcolor":"#653"},{"id":100,"type":"CLIPTextEncode","pos":{"0":950,"1":930},"size":{"0":250,"1":90},"flags":{},"order":15,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":205,"label":"clip"}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[171],"slot_index":0,"shape":3,"label":"CONDITIONING"}],"title":"Negative","properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["(embedding:bad-hands-5:0.8), deformed, blurry, unprofessional-bodies"],"color":"#223","bgcolor":"#335"},{"id":125,"type":"PrimitiveNode","pos":{"0":400,"1":80},"size":{"0":210,"1":82},"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"FLOAT","type":"FLOAT","links":[207,208],"slot_index":0,"widget":{"name":"threshold"},"label":"FLOAT"}],"title":"Detection Threshold","properties":{"Run widget replace on values":false},"widgets_values":[0.38,"fixed"],"color":"#223","bgcolor":"#335"},{"id":99,"type":"CLIPTextEncode","pos":{"0":950,"1":800},"size":{"0":250,"1":80},"flags":{},"order":14,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":204,"label":"clip"}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[170],"slot_index":0,"shape":3,"label":"CONDITIONING"}],"title":"Prompt","properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["female 
hand"],"color":"#232","bgcolor":"#353"},{"id":98,"type":"DetailerForEach","pos":{"0":1230,"1":660},"size":{"0":290,"1":620},"flags":{},"order":17,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":176,"label":"image"},{"name":"segs","type":"SEGS","link":186,"label":"segs"},{"name":"model","type":"MODEL","link":166,"label":"model"},{"name":"clip","type":"CLIP","link":206,"label":"clip"},{"name":"vae","type":"VAE","link":168,"label":"vae"},{"name":"positive","type":"CONDITIONING","link":170,"label":"positive"},{"name":"negative","type":"CONDITIONING","link":171,"label":"negative"},{"name":"detailer_hook","type":"DETAILER_HOOK","link":null,"shape":7,"label":"detailer_hook"},{"name":"scheduler_func_opt","type":"SCHEDULER_FUNC","link":null,"shape":7,"label":"scheduler_func_opt"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[174],"slot_index":0,"shape":3,"label":"IMAGE"}],"properties":{"Node name for S&R":"DetailerForEach"},"widgets_values":[384,true,1024,44041207002118,"randomize",30,5,"dpmpp_2m","karras",0.63,5,true,true,"",1,false,10]},{"id":117,"type":"SEGSPreviewCNet","pos":{"0":1230,"1":360},"size":{"0":330,"1":250},"flags":{},"order":16,"mode":0,"inputs":[{"name":"segs","type":"SEGS","link":189,"label":"segs"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":null,"shape":6,"label":"IMAGE"}],"properties":{"Node name for S&R":"SEGSPreviewCNet"},"widgets_values":[]},{"id":110,"type":"ImpactControlNetApplyAdvancedSEGS","pos":{"0":950,"1":360},"size":{"0":252,"1":186},"flags":{},"order":12,"mode":0,"inputs":[{"name":"segs","type":"SEGS","link":181,"label":"segs"},{"name":"control_net","type":"CONTROL_NET","link":180,"slot_index":1,"label":"control_net"},{"name":"segs_preprocessor","type":"SEGS_PREPROCESSOR","link":null,"shape":7,"label":"segs_preprocessor"},{"name":"control_image","type":"IMAGE","link":184,"shape":7,"label":"control_image"},{"name":"vae","type":"VAE","link":null,"shape":7,"label":"vae"}],"outputs":[{"name":"SEGS","type":"SEGS","links":[186,189],"slot_index":0,"shape":3,"label":"SEGS"}],"properties":{"Node name for S&R":"ImpactControlNetApplyAdvancedSEGS"},"widgets_values":[0.6300000000000001,0,1]},{"id":82,"type":"BboxDetectorSEGS","pos":{"0":710,"1":80},"size":{"0":210,"1":210},"flags":{},"order":9,"mode":0,"inputs":[{"name":"bbox_detector","type":"BBOX_DETECTOR","link":134,"label":"bbox_detector"},{"name":"image","type":"IMAGE","link":197,"label":"image"},{"name":"detailer_hook","type":"DETAILER_HOOK","link":null,"shape":7,"label":"detailer_hook"},{"name":"threshold","type":"FLOAT","link":207,"widget":{"name":"threshold"},"label":"threshold"}],"outputs":[{"name":"SEGS","type":"SEGS","links":[181,188],"slot_index":0,"shape":3,"label":"SEGS"}],"properties":{"Node name for S&R":"BboxDetectorSEGS"},"widgets_values":[0.38,40,2.2,15,"all"]},{"id":121,"type":"Note","pos":{"0":950,"1":580},"size":{"0":250,"1":60},"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["If depth maps look wrong, bypass \"ControlNetApplyAdvanced\" (Ctrl + B)"],"color":"#432","bgcolor":"#653"},{"id":116,"type":"SEGSPreview","pos":{"0":1230,"1":0},"size":[330,314],"flags":{},"order":13,"mode":0,"inputs":[{"name":"segs","type":"SEGS","link":188,"label":"segs"},{"name":"fallback_image_opt","type":"IMAGE","link":null,"shape":7,"label":"fallback_image_opt"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":null,"shape":6,"label":"IMAGE"}],"properties":{"Node name for 
S&R":"SEGSPreview"},"widgets_values":[true,0.2]},{"id":113,"type":"MeshGraphormer+ImpactDetector-DepthMapPreprocessor","pos":{"0":670,"1":360},"size":{"0":253.60000610351562,"1":280},"flags":{},"order":8,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":196,"label":"image"},{"name":"bbox_detector","type":"BBOX_DETECTOR","link":182,"label":"bbox_detector"},{"name":"bbox_threshold","type":"FLOAT","link":208,"widget":{"name":"bbox_threshold"},"label":"bbox_threshold"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[184],"slot_index":0,"shape":3,"label":"IMAGE"},{"name":"INPAINTING_MASK","type":"MASK","links":null,"shape":3,"label":"INPAINTING_MASK"}],"title":"MeshGraphormer Hand Refiner","properties":{"Node name for S&R":"MeshGraphormer+ImpactDetector-DepthMapPreprocessor"},"widgets_values":[0.38,40,2.2,15,30,"based_on_depth",8,8362359,512]},{"id":19,"type":"LoadImage","pos":{"0":310,"1":340},"size":[320,314],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[196,197,198],"slot_index":0,"shape":3,"label":"IMAGE"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"MASK"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["ComfyUI_00757_.png","image"],"color":"#232","bgcolor":"#353"},{"id":27,"type":"CheckpointLoaderSimple","pos":{"0":310,"1":700},"size":{"0":320,"1":100},"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[166],"slot_index":0,"shape":3,"label":"MODEL"},{"name":"CLIP","type":"CLIP","links":[203],"slot_index":1,"shape":3,"label":"CLIP"},{"name":"VAE","type":"VAE","links":[168],"slot_index":2,"shape":3,"label":"VAE"}],"properties":{"Node name for S&R":"CheckpointLoaderSimple"},"widgets_values":["tPonynai3_v65.safetensors"],"color":"#232","bgcolor":"#353"},{"id":111,"type":"ControlNetLoader","pos":{"0":530,"1":-30},"size":{"0":390,"1":60},"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"CONTROL_NET","type":"CONTROL_NET","links":[180],"shape":3,"label":"CONTROL_NET"}],"properties":{"Node name for S&R":"ControlNetLoader"},"widgets_values":["controlnetxlCNXL_bdsqlszDepth.safetensors"]}],"links":[[134,12,0,82,0,"BBOX_DETECTOR"],[166,27,0,98,2,"MODEL"],[168,27,2,98,4,"VAE"],[170,99,0,98,5,"CONDITIONING"],[171,100,0,98,6,"CONDITIONING"],[174,98,0,101,0,"IMAGE"],[176,102,0,98,0,"IMAGE"],[180,111,0,110,1,"CONTROL_NET"],[181,82,0,110,0,"SEGS"],[182,12,0,113,1,"BBOX_DETECTOR"],[184,113,0,110,3,"IMAGE"],[186,110,0,98,1,"SEGS"],[188,82,0,116,0,"SEGS"],[189,110,0,117,0,"SEGS"],[196,19,0,113,0,"IMAGE"],[197,19,0,82,1,"IMAGE"],[198,19,0,102,0,"*"],[203,27,1,124,0,"*"],[204,124,0,99,0,"CLIP"],[205,124,0,100,0,"CLIP"],[206,124,0,98,3,"CLIP"],[207,125,0,82,3,"FLOAT"],[208,125,0,113,2,"FLOAT"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.6830134553650707,"offset":[-40.609586368112105,13.934882963219849]}},"version":0.4}

Additional Context

(Please add any additional context or steps to reproduce the error here)



### Other

_No response_
ltdrdata commented 3 weeks ago

@comfyanonymous ComfyUI is failing to load this controlnet model. https://civitai.com/models/136070?modelVersionId=267507
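
Reading the log above, the AttributeError looks like a downstream symptom: at 19:27:30 ComfyUI reports "error could not detect control model type." and "error checkpoint does not contain controlnet or t2i adapter data ... controlnetxlCNXL_bdsqlszDepth.safetensors", so the ControlNet loader apparently returns None, and the Impact Pack later calls `control_net.copy()` on that None inside `apply_controlnet`. The earlier RuntimeError ("mat1 and mat2 shapes cannot be multiplied (154x2048 and 768x320)") is also consistent with a base-model/ControlNet mismatch (SDXL conditioning of width 2048 hitting a cross-attention projection that expects 768). The sketch below is a minimal, hypothetical illustration of that failure mode and of a defensive None check; `ControlNetStub`, `load_controlnet_or_none`, and this `apply_controlnet` are illustrative stand-ins, not ComfyUI's actual implementation.

```python
# Hypothetical sketch of the failure mode seen in the log above.
# All names here are illustrative, not ComfyUI's real code.

class ControlNetStub:
    """Stands in for a successfully loaded ControlNet model."""
    def copy(self):
        return ControlNetStub()

    def set_cond_hint(self, hint, strength, percent_range):
        return self


def load_controlnet_or_none(path: str):
    """Mimics a loader that returns None when the checkpoint is not
    recognized as ControlNet / T2I adapter data (as in the log)."""
    recognized = path.endswith("_known_good.safetensors")  # placeholder check
    if not recognized:
        print(f"error checkpoint does not contain controlnet or t2i adapter data {path}")
        return None
    return ControlNetStub()


def apply_controlnet(control_net, hint, strength, start, end):
    # Guarding against None turns the opaque AttributeError into a message
    # that points at the real problem: the model file could not be loaded,
    # or it does not match the base model (SDXL vs SD1.5).
    if control_net is None:
        raise ValueError(
            "ControlNet failed to load; check that the file is a valid "
            "ControlNet checkpoint compatible with your base model."
        )
    return control_net.copy().set_cond_hint(hint, strength, (start, end))


cn = load_controlnet_or_none("controlnetxlCNXL_bdsqlszDepth.safetensors")
try:
    apply_controlnet(cn, hint=None, strength=0.63, start=0.0, end=1.0)
except ValueError as e:
    print(e)
```

In other words, if the file linked above cannot be identified as a ControlNet by the loader, every downstream node that touches it will fail the same way; replacing it with a depth ControlNet that loads cleanly for the matching base model should make the `'NoneType' object has no attribute 'copy'` error disappear.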