huggingface / diffusers


'tuple' object has no attribute 'shape' with processor=AttnProcessor() #9901

Closed: massyzs closed this issue 1 week ago

massyzs commented 2 weeks ago

Describe the bug

I'm working on modifying the attention. When I set processor=AttnProcessor() while an IP-Adapter is loaded, it fails on this line:

hidden_states.shape if encoder_hidden_states is None else encoder_hidden_states.shape

AttributeError: 'tuple' object has no attribute 'shape'
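
For context, here is a minimal sketch of why the tuple shows up (an illustration with made-up tensor shapes, not the diffusers source): once an IP-Adapter is loaded, the UNet hands the attention processors the text embeddings and the image embeddings bundled together, and the plain AttnProcessor reads .shape on that bundle without unpacking it.

import torch

# Illustration only: the names and shapes below are assumptions for the sketch.
text_embeds = torch.randn(2, 77, 768)        # prompt embeddings
image_embeds = [torch.randn(2, 4, 768)]      # IP-Adapter image embeddings
encoder_hidden_states = (text_embeds, image_embeds)

try:
    encoder_hidden_states.shape              # the lookup AttnProcessor performs
except AttributeError as err:
    print(err)                               # 'tuple' object has no attribute 'shape'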

Reproduction

from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
import torch
from diffusers.utils import load_image

if __name__ == "__main__":
    controlnet_model_path = "lllyasviel/control_v11f1p_sd15_depth"
    controlnet = ControlNetModel.from_pretrained(controlnet_model_path, torch_dtype=torch.float16)

pipeline = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16)
pipeline.to("cuda")
pipeline.load_ip_adapter("h94/IP-Adapter", subfolder="models", weight_name="ip-adapter_sd15.bin")
ip_adapter_image = load_image("https://huggingface.co/datasets/YiYiXu/testing-images/resolve/main/statue.png")
depth_map = load_image("https://huggingface.co/datasets/YiYiXu/testing-images/resolve/main/depth.png")
generator = torch.Generator(device="cpu").manual_seed(33)
from diffusers.models.attention_processor import AttnProcessor
pipeline.unet.set_attn_processor(processor=AttnProcessor())
pipeline.controlnet.set_attn_processor(processor=AttnProcessor())
image = pipeline(
    prompt="best quality, high quality",
    image=depth_map,
    ip_adapter_image=ip_adapter_image,
    negative_prompt="monochrome, lowres, bad anatomy, worst quality, low quality",
    num_inference_steps=50,
    generator=generator,
).images[0]
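
A possible workaround until the fix is installed, sketched under the assumption that only the self-attention processors need to be replaced: keep the IP-Adapter processors on the cross-attention layers (they know how to unpack the bundled embeddings) and swap in AttnProcessor only for the attn1 layers.

from diffusers.models.attention_processor import AttnProcessor

# Selective swap sketch: "attn1.processor" entries are the self-attention
# processors in the UNet's attn_processors mapping; everything else is kept.
procs = {
    name: AttnProcessor() if name.endswith("attn1.processor") else proc
    for name, proc in pipeline.unet.attn_processors.items()
}
pipeline.unet.set_attn_processor(procs)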

Logs

No response

System Info

torch: 2.4.1, diffusers: 0.26.0

Who can help?

No response

elismasilva commented 2 weeks ago

@massyzs this bug was recently fixed in the core attention processors (#9881); if you install from the main repository, the error should go away.
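
For reference, installing diffusers from the main branch (where the fix landed) typically looks like:

pip install --upgrade git+https://github.com/huggingface/diffusers.git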

yiyixuxu commented 1 week ago

closing as fixed by @elismasilva