DQiaole / MemFlow

[CVPR 2024] MemFlow: Optical Flow Estimation and Prediction with Memory
Apache License 2.0

SSM model integration instead of Flash Attention #9

Open looper99 opened 1 month ago

looper99 commented 1 month ago

Hi, very nice work! How easy would it be to integrate an SSM (state-space model) here instead of Flash Attention?

Would it even help? SSMs claim O(N) complexity while attention is O(N^2); the question is whether an SSM could actually outperform the existing Flash Attention memory bank.
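For reference, a toy sketch of where the complexity difference comes from (not MemFlow code; the shapes and the diagonal transition `A` are purely illustrative):

```python
import torch

# Hypothetical shapes: B batch, N memory entries, C channels.
B, N, C = 1, 1024, 256
query = torch.randn(B, N, C)
memory = torch.randn(B, N, C)

# Attention-style readout: the B x N x N score matrix is what makes
# compute (and memory) scale as O(N^2) in the number of memory entries.
scores = torch.softmax(query @ memory.transpose(1, 2) / C**0.5, dim=-1)
attn_out = scores @ memory  # O(N^2 * C)

# SSM-style readout: a fixed-size recurrent state is updated once per
# entry, so compute scales as O(N) and state size is independent of N.
A = torch.rand(C) * 0.9          # toy diagonal state transition
state = torch.zeros(B, C)
ssm_out = []
for t in range(N):
    state = A * state + memory[:, t]  # O(C) per step -> O(N * C) total
    ssm_out.append(state)
ssm_out = torch.stack(ssm_out, dim=1)
```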

DQiaole commented 1 month ago

Hi, the idea is promising. An SSM has lower computational complexity than our attention block. However, I think it would take considerable effort and a design tailored specifically to the memory bank.
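As a rough illustration of what such an integration might look like, here is a minimal sketch of an SSM-style readout over a memory bank. The class name `SSMMemoryReadout`, the gating scheme, and the tensor shapes are assumptions for discussion, not MemFlow's actual modules:

```python
import torch
import torch.nn as nn

class SSMMemoryReadout(nn.Module):
    """Hypothetical drop-in for an attention-based memory readout.

    The memory bank is folded into a fixed-size recurrent state, and the
    current frame's features read from that state through a simple gate.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.in_proj = nn.Linear(dim, dim)
        self.decay = nn.Parameter(torch.zeros(dim))   # learned per-channel decay
        self.gate = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, query: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # query:  (B, L, C) current-frame features
        # memory: (B, N, C) memory bank, ordered oldest -> newest
        B, N, C = memory.shape
        a = torch.sigmoid(self.decay)                  # per-channel decay in (0, 1)
        state = torch.zeros(B, C, device=memory.device)
        for t in range(N):                             # O(N): no N x N score matrix
            state = a * state + (1 - a) * self.in_proj(memory[:, t])
        # Gate the shared memory state with the per-position query features.
        readout = torch.sigmoid(self.gate(query)) * state.unsqueeze(1)
        return self.out_proj(readout)
```

Whether this kind of readout can match the attention memory bank in accuracy is exactly the open question; the recurrence compresses all past entries into one state, so some selectivity that attention gets for free would have to come from the design of the state update.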