LeapLabTHU / Agent-Attention

Official repository of Agent Attention (ECCV2024)

forward() got an unexpected keyword argument 'encoder_hidden_states' #18

Open honglunzhang-mt opened 7 months ago

honglunzhang-mt commented 7 months ago

Hello, I greatly appreciate your awesome work.

It seems that `self.attn1` in https://github.com/LeapLabTHU/Agent-Attention/blob/master/agentsd/patch.py#L220 is replaced with `AgentAttention`, whose forward function has the signature `forward(self, x, agent=None, context=None, mask=None)`. However, at https://github.com/LeapLabTHU/Agent-Attention/blob/master/agentsd/patch.py#L220, `encoder_hidden_states` and `attention_mask` are passed to `self.attn1`, which raises `forward() got an unexpected keyword argument 'encoder_hidden_states'`.

Do you have a solution? Thanks a lot.
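As a stopgap, one could wrap the patched forward so that diffusers-style keyword arguments are translated into the names `AgentAttention.forward` expects. This is a minimal sketch, assuming the signature described above (`forward(x, agent=None, context=None, mask=None)`) and that `encoder_hidden_states`/`attention_mask` correspond to `context`/`mask`; `make_diffusers_compatible` is a hypothetical helper, not part of the repo:

```python
def make_diffusers_compatible(attn_forward):
    """Wrap a forward(x, agent=None, context=None, mask=None) callable so it
    also accepts diffusers-style keyword argument names.

    Hypothetical shim: maps encoder_hidden_states -> context and
    attention_mask -> mask, then delegates to the original forward.
    """
    def forward(x, *args, encoder_hidden_states=None, attention_mask=None, **kwargs):
        if encoder_hidden_states is not None:
            kwargs.setdefault("context", encoder_hidden_states)
        if attention_mask is not None:
            kwargs.setdefault("mask", attention_mask)
        return attn_forward(x, *args, **kwargs)
    return forward
```

One could then rebind the method on the patched block, e.g. `block.attn1.forward = make_diffusers_compatible(block.attn1.forward)`, assuming cross-attention semantics actually line up between the two APIs.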

honglunzhang-mt commented 7 months ago

The above issue is likely caused by the fact that agentsd does not currently support diffusers.

tian-qing001 commented 7 months ago

Hi @honglunzhang-mt. I greatly appreciate your interest in our work. Currently, agentsd does not support diffusers, but we plan to include support for it in a few weeks. We also encourage and welcome contributions from the community to help achieve this.