Kosinkadink / ComfyUI-Advanced-ControlNet

ControlNet scheduling and masking nodes with sliding context support
GNU General Public License v3.0

use controlnet++ error: output with shape [1, 1280] doesn't match the broadcast shape #135

Closed: xueqing0622 closed this issue 1 month ago

xueqing0622 commented 3 months ago

With autocfg, if the end percent is > 0.75:

File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\control_plusplus.py", line 194, in forward
    emb += self.control_add_embedding(control_type, emb.dtype, emb.device)
RuntimeError: output with shape [1, 1280] doesn't match the broadcast shape [2, 1280]

Kosinkadink commented 3 months ago

Can you provide some additional details? What is autocfg? Can you create a simple workflow that reproduces the error?

xueqing0622 commented 3 months ago

[rgthree] Using rgthree's optimized recursive execution.
Model maximum sigma: 14.614640235900879 / Model minimum sigma: 0.029167160391807556
Sampling function patched. Uncond enabled from 1000 to 1
Requested to load AutoencoderKL
Loading 1 new model
Requested to load SDXL
Requested to load ControlNetPlusPlus
Loading 2 new models
 75%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████▎ | 6/8 [00:08<00:02, 1.38s/it]
!!! Exception during processing!!! output with shape [1, 1280] doesn't match the broadcast shape [2, 1280]
Traceback (most recent call last):
  File "F:\ComfyUI\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "F:\ComfyUI\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "F:\ComfyUI\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5413, in simple
    return super().run(pipe, None, None, None, None, None, image_output, link_id, save_prefix,
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5382, in run
    return process_sample_state(pipe, samp_model, samp_clip, samp_samples, samp_vae, samp_seed, samp_positive, samp_negative, steps, start_step, last_step, cfg, sampler_name, scheduler, denoise, image_output, link_id, save_prefix, tile_size, prompt, extra_pnginfo, my_unique_id, preview_latent, force_full_denoise, disable_noise, samp_custom)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 5166, in process_sample_state
    samp_samples = sampler.common_ksampler(samp_model, samp_seed, steps, cfg, sampler_name, scheduler, samp_positive, samp_negative, samp_samples, denoise=denoise, preview_latent=preview_latent, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, disable_noise=disable_noise)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\libs\sampler.py", line 118, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative,
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
    raise e
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)  # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 410, in motion_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\control_reference.py", line 47, in refcn_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 112, in uncond_multiplier_check_cn_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "F:\ComfyUI\ComfyUI\comfy\sample.py", line 43, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 801, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 703, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 690, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 669, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 574, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "F:\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "F:\ComfyUI\ComfyUI\comfy\k_diffusion\sampling.py", line 160, in sample_euler_ancestral
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 297, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 656, in __call__
    return self.predict_noise(*args, **kwargs)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 659, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AutomaticCFG\nodes.py", line 52, in sampling_function_patched
    out = comfy.samplers.calc_cond_batch(model, conds, x, timestep, model_options)
  File "F:\ComfyUI\ComfyUI\comfy\samplers.py", line 200, in calc_cond_batch
    c['control'] = control.get_control(input_x, timestep, c, len(cond_or_uncond))
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 741, in get_control_inject
    return self.get_control_advanced(x_noisy, t, cond, batched_number)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\control.py", line 442, in get_control_advanced
    control_prev = self.previous_controlnet.get_control(x_noisy, t, cond, batched_number)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 741, in get_control_inject
    return self.get_control_advanced(x_noisy, t, cond, batched_number)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\control_plusplus.py", line 339, in get_control_advanced
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(dtype), y=y, control_type=self.cond_hint_types)
  File "F:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\control_plusplus.py", line 194, in forward
    emb += self.control_add_embedding(control_type, emb.dtype, emb.device)
RuntimeError: output with shape [1, 1280] doesn't match the broadcast shape [2, 1280]

xueqing0622 commented 3 months ago

RuntimeError: output with shape [1, 1280] doesn't match the broadcast shape [2, 1280]

The error is caused by using ControlNet++. As a workaround, first turn off all the ControlNets and run once, then turn them all back on and run again.

xThIsIsBoToXx commented 2 months ago

I am somehow running into the same issue as well, but have it working in another workflow for some reason. Edit: figured out that this error occurs when the new ControlNet is used together with PAG.

PrometheusDante commented 2 months ago

I am running into this error seemingly at random as well (without using PAG). I noticed it never occurs when using only one type of ControlNet++, while chaining two types together, or doing two passes with a single one, already has a chance of producing it.

Kosinkadink commented 2 months ago

Thank you for the reports. With the last two replies about PAG and other usage, I think I've got a clue as to the cause. I'll keep you posted, and should have a fix sometime this weekend.

Kosinkadink commented 2 months ago

I spent several hours looking into this, and I cannot replicate it whatsoever. I tried with and without PAG, no issues. I tried in Single mode, no issues. I tried in Multi mode (up to three), no issues. I tried modifying code to replicate a situation in which different steps would batch the conds and unconds in different amounts like what happens if calculated free VRAM changes between different steps, no issues. Tried chaining multiple KSamplers with the same ControlNet object plugged in, no issues. Tried disabling the controlnet, sampling, then re-enabling and sampling, no issues.

Can peeps in this thread create the simplest workflow you possibly can, that does not use any external nodes besides ACN, ADE, and VHS, and that replicates this issue?

Kosinkadink commented 2 months ago

Also to clear things up, to the people who experienced the issue with PAG, what node do you use for PAG? I used the built-in ComfyUI PAG node in testing: image

grinlau18 commented 2 months ago

I encountered the same problem. When I combined three or more Conditionings and used the 'Load ControlNet++ Model (Multi)' node, KSampler reported the same error.

Error occurred when executing KSampler:

output with shape [1, 1280] doesn't match the broadcast shape [2, 1280]

File "/root/autodl-tmp/ComfyUI/execution.py", line 316, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "/root/autodl-tmp/ComfyUI/execution.py", line 191, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "/root/autodl-tmp/ComfyUI/execution.py", line 168, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/root/autodl-tmp/ComfyUI/execution.py", line 157, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "/root/autodl-tmp/ComfyUI/nodes.py", line 1429, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
  File "/root/autodl-tmp/ComfyUI/nodes.py", line 1396, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Impact-Pack/modules/impact/sample_error_enhancer.py", line 22, in informative_sample
    raise e
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Impact-Pack/modules/impact/sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)  # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-AnimateDiff/animatediff/sampling.py", line 410, in motion_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-AnimateDiff-Evolved/animatediff/sampling.py", line 434, in motion_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Advanced-ControlNet/adv_control/sampling.py", line 116, in acn_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Advanced-ControlNet/adv_control/utils.py", line 116, in uncond_multiplier_check_cn_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/comfy/sample.py", line 43, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 829, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-BrushNet/model_patch.py", line 120, in modified_sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 716, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 695, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 600, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "/root/miniconda3/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/comfy/k_diffusion/sampling.py", line 612, in sample_dpmpp_sde
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 682, in __call__
    return self.predict_noise(*args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 685, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 202, in calc_cond_batch
    c['control'] = control.get_control(input_x, timestep, c, len(cond_or_uncond))
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Advanced-ControlNet/adv_control/utils.py", line 758, in get_control_inject
    return self.get_control_advanced(x_noisy, t, cond, batched_number)
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Advanced-ControlNet/adv_control/control_plusplus.py", line 339, in get_control_advanced
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(dtype), y=y, control_type=self.cond_hint_types)
  File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/miniconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Advanced-ControlNet/adv_control/control_plusplus.py", line 194, in forward
    emb += self.control_add_embedding(control_type, emb.dtype, emb.device)

zhiselfly commented 1 month ago

@Kosinkadink I found that at line 174 of comfy/samplers.py within ComfyUI, due to insufficient memory, batch_chunks becomes 1 at line 196, and this consequently affects the ControlNetPlusPlusAdvanced.get_control_advanced method, where batched_number turns out to be 1.
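For context, here is a rough sketch (hypothetical names, not ComfyUI's actual code) of how cond/uncond batching can shrink under memory pressure, which is what makes batched_number drop from 2 to 1 between steps:

```python
def plan_chunks(num_conds: int, free_memory: int, memory_per_cond: int) -> list[int]:
    # How many conds fit in one forward pass given the current free memory.
    batch_chunks = min(num_conds, max(1, free_memory // memory_per_cond))
    chunks = []
    remaining = num_conds
    while remaining > 0:
        take = min(batch_chunks, remaining)
        chunks.append(take)  # each entry becomes `batched_number` for that pass
        remaining -= take
    return chunks

# Plenty of memory: cond and uncond run together, batched_number == 2.
print(plan_chunks(2, free_memory=8, memory_per_cond=3))  # [2]
# Memory dips between steps: each cond runs alone, batched_number == 1.
print(plan_chunks(2, free_memory=3, memory_per_cond=3))  # [1, 1]
```

Since free memory is re-measured per sampling step, the same ControlNet object can see batched_number == 2 on one step and == 1 on the next.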

Concerning this,

control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(dtype), y=y, control_type=self.cond_hint_types)

In that call, the length of timestep is 1, yet the parameter control_type (i.e. self.cond_hint_types), which holds a vector indicating the conditioning type for each batch item, still has a length of 2.

To solve this issue temporarily, I modified how I call self.control_model by introducing a queue that hands out cond_hint_types based on the updated batched_number. This seems to resolve the problem in the short term, but since I am not deeply familiar with ComfyUI or the specifics of this plugin's codebase, I'm uncertain whether such a modification is appropriate.

if batched_number == 1:
    # Memory pressure split the conds into single-item batches: hand out the
    # cached per-cond control types one at a time instead of all at once.
    if self.step_ctrl_cache is None or len(self.step_ctrl_cache) <= 0:
        self.step_ctrl_cache = [self.cond_hint_types[i] for i in range(self.cond_hint_types.shape[0])]
    cur_ctrl_type = self.step_ctrl_cache.pop(0)[None, :]
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(dtype), y=y, control_type=cur_ctrl_type)
else:
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.float(), context=context.to(dtype), y=y, control_type=self.cond_hint_types)

Also, I added self.step_ctrl_cache = None in ControlNetPlusPlusAdvanced.__init__.

By the way, my workflow like this:

image

Kosinkadink commented 1 month ago

Good catch, thanks for looking into it. It would explain why I was unable to replicate the issue when I looked into this earlier - the batched number has to change between steps, alongside the number of cond_hint types, for the issue to occur. Setting except_one to False on that call should fix the issue. Would you be able to try that on your end, to see if it fixes it without breaking normal ControlNet++ usage?

If so, I'll get it merged right in.

zhiselfly commented 1 month ago

I think the problem is caused by the reuse of the ControlNetPlusPlusAdvanced instance.

zhiselfly commented 1 month ago

image

When the get_control_advanced method of a ControlNetPlusPlusAdvanced instance is called for the first time, self.cond_hint_types has shape [1, x]. When broadcast_image_to_extend is first called (the second red box in the figure above), assuming batched_number is 2, self.cond_hint_types becomes [2, x]. However, when broadcast_image_to_extend is called on the same instance again after batched_number has dropped to 1 due to insufficient memory, the except_one=True argument makes it return the [2, x] cond_hint_types unchanged, which causes the subsequent error!
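That failure mode can be sketched in a few lines. This is a simplified stand-in with hypothetical names (the real function is broadcast_image_to_extend in adv_control/utils.py), not the plugin's actual code:

```python
import numpy as np

def broadcast_to_batch(cached, batched_number, except_one=True):
    # Simplified stand-in: repeat the per-cond rows so there is one
    # row per batched cond.
    if except_one and batched_number == 1:
        return cached  # returned as-is, even if it was already expanded
    if cached.shape[0] == batched_number:
        return cached
    return np.repeat(cached, batched_number, axis=0)

cond_hint_types = np.zeros((1, 6))                        # fresh instance: [1, x]
cond_hint_types = broadcast_to_batch(cond_hint_types, 2)  # step with batched_number=2
print(cond_hint_types.shape)  # (2, 6)

# VRAM dips, batched_number becomes 1; except_one=True hands back the
# stale [2, x] array, which later collides with an emb of batch size 1.
stale = broadcast_to_batch(cond_hint_types, 1)
print(stale.shape)  # (2, 6) -- should be (1, 6)
```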

zhiselfly commented 1 month ago

> Good catch, thanks for looking into it. It would explain why I was unable to replicate the issue back when I looked into this - the order of the batches number changing matters in order to run into the issue alongside the number of cond_hint types. Setting except_one to False on that call should fix the issue. Would you be able to try that on your end to see if it fixes it and doesn't break normal ControlNet++ usage?
>
> If so, I'll get it merged right in.

OK.

zhiselfly commented 1 month ago

Simply setting except_one to False does not solve the problem. It needs to be changed like this:

image

to

image

Here is my PR
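The before/after screenshots aren't preserved in this transcript, but based on the discussion, the shape of the fix is roughly this: never reuse the already-expanded tensor, and instead broadcast a fresh copy from the unexpanded [1, x] hint types for each call's batched_number. A hypothetical sketch (names are illustrative, not the PR's actual code):

```python
import numpy as np

def get_hint_types_for_call(base_hint_types, batched_number):
    # Always broadcast from the original, unexpanded [1, x] copy, so a
    # batched_number that shrinks between steps can never leave a stale
    # [2, x] tensor behind.
    return np.repeat(base_hint_types, batched_number, axis=0)

base = np.zeros((1, 6))  # stored once, never mutated
print(get_hint_types_for_call(base, 2).shape)  # (2, 6)
print(get_hint_types_for_call(base, 1).shape)  # (1, 6) -- matches the emb batch
```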

Kosinkadink commented 1 month ago

The issue should now be fixed; I merged zhiselfly's PR. If you still hit the error after updating to the latest changes, please reopen the issue.