Shakker-Labs / ComfyUI-IPAdapter-Flux

Apache License 2.0

RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: struct c10::Half key.dtype: struct c10::BFloat16 and value.dtype: struct c10::BFloat16 instead. #10

Open msola-ht opened 5 days ago

msola-ht commented 5 days ago

2080 Ti graphics card

```
loaded completely 0.0 14086.067443847656 True
  0%|          | 0/25 [00:00<?, ?it/s]
!!! Exception during processing !!! Expected query, key, and value to have the same dtype, but got query.dtype: struct c10::Half key.dtype: struct c10::BFloat16 and value.dtype: struct c10::BFloat16 instead.
Traceback (most recent call last):
  File "F:\AIGC\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "F:\AIGC\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "F:\AIGC\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "F:\AIGC\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "F:\AIGC\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 740, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 719, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "F:\AIGC\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion\utils.py", line 34, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 624, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "F:\AIGC\env\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "F:\AIGC\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 706, in __call__
    return self.predict_noise(*args, **kwargs)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 709, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "F:\AIGC\ComfyUI\comfy\samplers.py", line 228, in calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "F:\AIGC\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 69, in apply_model_uncond_cleanup_wrapper
    return orig_apply_model(self, *args, **kwargs)
  File "F:\AIGC\ComfyUI\comfy\model_base.py", line 145, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "F:\AIGC\env\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\AIGC\env\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\AIGC\ComfyUI\comfy\ldm\flux\model.py", line 184, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options)
  File "F:\AIGC\ComfyUI\custom_nodes\ComfyUI-IPAdapter-Flux\utils.py", line 84, in forward_orig_ipa
    img, txt = block(img=img, txt=txt, vec=vec, pe=pe, t=timesteps)
  File "F:\AIGC\env\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\AIGC\env\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\AIGC\ComfyUI\custom_nodes\ComfyUI-IPAdapter-Flux\flux\layers.py", line 58, in forward
    ip_hidden_states = self.ip_adapter(self.num_heads, img_q, self.image_emb, t)
  File "F:\AIGC\ComfyUI\custom_nodes\ComfyUI-IPAdapter-Flux\attention_processor.py", line 48, in __call__
    ip_hidden_states = F.scaled_dot_product_attention(query.to(image_emb.device),
```

philipy1219 commented 4 days ago

The 2080 Ti does not support bfloat16. We will submit a new version tomorrow to add an autocasting mechanism.
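For anyone who needs a stopgap before that release, the idea behind such an autocast is simply to bring the key and value tensors to the query's dtype before the attention call. The sketch below is a hypothetical illustration, not the extension's actual patch; `ip_attention` and its signature are made up for the example:

```python
import torch
import torch.nn.functional as F

def ip_attention(query: torch.Tensor, key: torch.Tensor, value: torch.Tensor) -> torch.Tensor:
    # Align key/value with the query's dtype and device (fp16 on pre-Ampere
    # cards such as the 2080 Ti, bf16 on newer GPUs) so SDPA sees one dtype.
    key = key.to(dtype=query.dtype, device=query.device)
    value = value.to(dtype=query.dtype, device=query.device)
    return F.scaled_dot_product_attention(query, key, value)
```

Casting toward the query dtype (rather than the reverse) keeps the computation in whatever precision the card natively supports.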

Amazon90 commented 3 days ago

> The 2080 Ti does not support bfloat16. We will submit a new version tomorrow to add an autocasting mechanism.


# ComfyUI Error Report

## Error Details


## System Information
- **ComfyUI Version:** v0.3.4
- **Arguments:** D:\ComfyUI\main.py --auto-launch --preview-method auto --disable-cuda-malloc --fast
- **OS:** nt
- **Python Version:** 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
- **Embedded Python:** false
- **PyTorch Version:** 2.5.1+cu124
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 4070 Ti : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 12878086144
  - **VRAM Free:** 1624655544
  - **Torch VRAM Total:** 9898557440
  - **Torch VRAM Free:** 84256440
Amazon90 commented 3 days ago

> The 2080 Ti does not support bfloat16. We will submit a new version tomorrow to add an autocasting mechanism.

The issue has not been fixed; instead, problems have arisen with the 40 series graphics cards after the update.

slmonker commented 3 days ago

After updating today, I get the same error on a 4090.

ericxl277 commented 3 days ago

Me too, I updated today and get the same error.

slmonker commented 3 days ago

> Me too, I updated today and get the same error.

It's working now with the new update.