LeapLabTHU / FLatten-Transformer

Official repository of FLatten Transformer (ICCV2023)

Subject: Inquiry About Lightweight Feature Extraction with Your Attention Mechanism #25

Closed · Zhangyuhaoo closed this issue 3 weeks ago

Zhangyuhaoo commented 3 weeks ago

I hope this message finds you well. I recently read your paper, "FLatten Transformer: Vision Transformer using Focused Linear Attention," and was truly impressed by the work.

I am currently working on feature point extraction and matching, with a focus on lightweight models. I am particularly interested in whether it would be feasible to replace the standard self-attention mechanisms in backbone networks with the attention mechanism you proposed in your research.

I would be grateful for any insights or suggestions on this approach, and I look forward to your response.

Thank you very much, and best wishes.

tian-qing001 commented 3 weeks ago

Hi @Zhangyuhaoo, thanks for your interest in our work. Focused linear attention can serve as an efficient and effective alternative to standard Softmax self-attention in a variety of models, and several works have already explored this direction. Feel free to try our method in your model.
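For reference, here is a minimal, self-contained PyTorch sketch of the idea: a simplified focused linear attention layer, not the official module from this repository. The class name, the default `focusing_factor=3`, and the 3x3 depth-wise convolution kernel are illustrative assumptions; see the model files in this repo for the actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FocusedLinearAttention(nn.Module):
    """Simplified sketch of focused linear attention (illustrative only).

    Cost is O(N) in the token count N because phi(K)^T V is computed
    before multiplying by phi(Q), instead of forming the N x N map.
    """

    def __init__(self, dim, num_heads=8, focusing_factor=3):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.focusing_factor = focusing_factor  # power p in the focused map
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)
        # Depth-wise conv on V compensates for the low rank of linear
        # attention; the 3x3 kernel size here is an assumption.
        self.dwc = nn.Conv2d(self.head_dim, self.head_dim, kernel_size=3,
                             padding=1, groups=self.head_dim)

    def focused_map(self, x):
        # phi(x) = ||x|| * ReLU(x)^p / ||ReLU(x)^p||: preserves the norm
        # while sharpening the direction, making attention more "focused".
        x = F.relu(x) + 1e-6
        norm = x.norm(dim=-1, keepdim=True)
        x = x ** self.focusing_factor
        return (x / x.norm(dim=-1, keepdim=True)) * norm

    def forward(self, x, H, W):
        # x: (B, N, C) token sequence with N == H * W
        B, N, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k = self.focused_map(q), self.focused_map(k)

        def split_heads(t):
            return t.reshape(B, N, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = map(split_heads, (q, k, v))        # each (B, h, N, d)
        kv = torch.einsum('bhnd,bhne->bhde', k, v)   # K^T V first: (B, h, d, d)
        z = 1.0 / (torch.einsum('bhnd,bhd->bhn', q, k.sum(dim=2)) + 1e-6)
        out = torch.einsum('bhnd,bhde,bhn->bhne', q, kv, z)  # (B, h, N, d)

        # Rank restoration: depth-wise conv over the value feature map.
        v_img = v.reshape(B * self.num_heads, H, W, self.head_dim).permute(0, 3, 1, 2)
        out = out + self.dwc(v_img).permute(0, 2, 3, 1).reshape(
            B, self.num_heads, N, self.head_dim)

        out = out.transpose(1, 2).reshape(B, N, C)
        return self.proj(out)
```

Swapping it into a lightweight backbone then amounts to replacing the attention module while keeping the surrounding block (norms, MLP, residuals) unchanged. It is a drop-in shape-wise:

```python
attn = FocusedLinearAttention(dim=96, num_heads=4)
x = torch.randn(2, 14 * 14, 96)   # (batch, tokens, channels)
y = attn(x, H=14, W=14)           # same shape as a softmax attention output
```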