LeapLabTHU / Agent-Attention

Official repository of Agent Attention (ECCV2024)

Inquiry About Integrating Agent Attention into xformers Library #33

Open XCZhou520 opened 2 months ago

XCZhou520 commented 2 months ago

Dear Dr. Han and Dr. Ye,

I have been greatly impressed by your work on the Agent Attention model, as detailed in your recent publication and the associated GitHub repository. The way it integrates Softmax attention with linear attention to improve computational efficiency while preserving strong expressiveness is particularly compelling.
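
To check my understanding of the mechanism, here is a rough single-head sketch of the two softmax stages in PyTorch. The variable names, the agent count, and the pooling operator are my own assumptions, and it omits the agent bias and depthwise-convolution module from the paper:

```python
import torch
import torch.nn.functional as F

def agent_attention(q, k, v, num_agents=49):
    """Two-stage agent attention sketch. q, k, v: (batch, seq_len, dim)."""
    b, n, d = q.shape
    scale = d ** -0.5
    # Agent tokens pooled from the queries (pooling choice is an assumption).
    a = F.adaptive_avg_pool1d(q.transpose(1, 2), num_agents).transpose(1, 2)  # (b, m, d)
    # Stage 1, agent aggregation: agents attend to keys/values via softmax.
    agent_v = torch.softmax(a @ k.transpose(-2, -1) * scale, dim=-1) @ v      # (b, m, d)
    # Stage 2, agent broadcast: queries attend to the agent tokens.
    return torch.softmax(q @ a.transpose(-2, -1) * scale, dim=-1) @ agent_v   # (b, n, d)
```

If I read the paper correctly, with m agent tokens this replaces the O(n²d) cost of full softmax attention with two O(nmd) stages, which is where the efficiency gain comes from.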

Given that the xformers library is a platform for optimized, memory-efficient Transformer components, I am curious whether there are any plans to integrate the Agent Attention mechanism into xformers. Such an integration could make your approach more accessible and practical for a broader audience, enabling developers and researchers to use Agent Attention in real-world applications more readily.

Could you please share any information about plans to port the Agent Attention code to xformers or similar libraries, or about any ongoing projects aimed at such an integration?

Thank you for your time and consideration.

Best regards,

xczhou

tian-qing001 commented 2 months ago

Hi @XCZhou520, thanks for your interest in our work. We plan to apply xformers or FlashAttention to agent attention in the future, and we also encourage and welcome contributions from the community toward this goal.
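
For anyone who wants to attempt this, the two softmax stages should map naturally onto existing fused attention kernels. Below is a minimal sketch using PyTorch's `scaled_dot_product_attention` (xformers' `memory_efficient_attention` could slot into the same two call sites); the pooling step, agent count, and shapes are illustrative assumptions, not our final design:

```python
import torch
import torch.nn.functional as F

def agent_attention_fused(q, k, v, num_agents=49):
    """q, k, v: (batch, heads, seq_len, head_dim)."""
    b, h, n, d = q.shape
    # Pool agent tokens from the queries (illustrative choice).
    a = F.adaptive_avg_pool1d(q.reshape(b * h, n, d).transpose(1, 2), num_agents)
    a = a.transpose(1, 2).reshape(b, h, num_agents, d)     # (b, h, m, d)
    # Each stage is an ordinary softmax attention, so a fused kernel applies.
    agent_v = F.scaled_dot_product_attention(a, k, v)      # stage 1: agents <- k, v
    return F.scaled_dot_product_attention(q, a, agent_v)   # stage 2: queries <- agents
```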