LeapLabTHU / Agent-Attention

Official repository of Agent Attention (ECCV2024)

Dimensions of the attention bias #13

Closed. Eight3H closed this issue 8 months ago.

Eight3H commented 8 months ago

Hello, author. If I have not only h and w but also a third dimension d, have you considered this case? Please contact me at www.756233138@foxmail.com

Eight3H commented 8 months ago

This concerns only the agent attention part of the code, specifically the attention bias.

tian-qing001 commented 8 months ago

Hi @Eight3H, please provide more details regarding your question.

Eight3H commented 8 months ago

```python
self.ah_bias = nn.Parameter(torch.zeros(1, num_heads, agent_num, window_size[0], 1))
self.aw_bias = nn.Parameter(torch.zeros(1, num_heads, agent_num, 1, window_size[1]))
self.ha_bias = nn.Parameter(torch.zeros(1, num_heads, window_size[0], 1, agent_num))
self.wa_bias = nn.Parameter(torch.zeros(1, num_heads, 1, window_size[1], agent_num))
```

Your attention bias is defined like this. Now I have not only h and w but also d, i.e. I am processing 3D data. How should this be handled? Have you considered it?

tian-qing001 commented 8 months ago

Please provide a clear and detailed description of your problem rather than assuming I already know the specific area and task you are addressing. If 'd' denotes depth, consider expanding the dimensions of the agent bias and treating depth the same way as height and width. If 'd' instead represents a channel/feature dimension, treat it analogously to num_heads.
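If 'd' is depth, the factorized bias construction above generalizes directly: keep one rank-1 bias factor per spatial axis and add a third factor for depth. A minimal sketch of this idea (the `AgentBias3D` wrapper and the `ad_bias`/`da_bias` names are assumptions for illustration, not code from this repository):

```python
import torch
import torch.nn as nn

class AgentBias3D(nn.Module):
    # Hypothetical sketch: extends the decomposed 2D agent attention bias
    # to a 3D (depth, height, width) window. ad_bias/da_bias mirror the
    # roles of ah_bias/ha_bias in the 2D code.
    def __init__(self, num_heads, agent_num, window_size):
        super().__init__()
        d, h, w = window_size  # window_size = (D, H, W)
        # agent -> token biases: one rank-1 factor per spatial axis
        self.ad_bias = nn.Parameter(torch.zeros(1, num_heads, agent_num, d, 1, 1))
        self.ah_bias = nn.Parameter(torch.zeros(1, num_heads, agent_num, 1, h, 1))
        self.aw_bias = nn.Parameter(torch.zeros(1, num_heads, agent_num, 1, 1, w))
        # token -> agent biases, factored the same way
        self.da_bias = nn.Parameter(torch.zeros(1, num_heads, d, 1, 1, agent_num))
        self.ha_bias = nn.Parameter(torch.zeros(1, num_heads, 1, h, 1, agent_num))
        self.wa_bias = nn.Parameter(torch.zeros(1, num_heads, 1, 1, w, agent_num))
        self.num_heads, self.agent_num = num_heads, agent_num

    def forward(self):
        # Summing broadcasts the three factors into a full (D, H, W) bias grid,
        # then the spatial axes are flattened, matching the 2D reshape pattern.
        agent_bias = (self.ad_bias + self.ah_bias + self.aw_bias).reshape(
            1, self.num_heads, self.agent_num, -1)   # agents attending to tokens
        token_bias = (self.da_bias + self.ha_bias + self.wa_bias).reshape(
            1, self.num_heads, -1, self.agent_num)   # tokens attending to agents
        return agent_bias, token_bias

# Example: a (4, 7, 7) window with 49 agents and 4 heads yields
# agent_bias of shape (1, 4, 49, 196) and token_bias of shape (1, 4, 196, 49).
bias = AgentBias3D(num_heads=4, agent_num=49, window_size=(4, 7, 7))
agent_bias, token_bias = bias()
```

Summing the three factors before flattening mirrors exactly how the 2D code combines ah_bias with aw_bias (and ha_bias with wa_bias), so the added depth axis changes only the parameter shapes, not the overall structure.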

Eight3H commented 8 months ago

It's depth.