pengzhangzhi / Awesome-Mamba

Awesome list of papers that extend Mamba to various applications.

Repeat After Me: Transformers are Better than State Space Models at Copying #3

Open GianlucaMancusi opened 4 months ago

GianlucaMancusi commented 4 months ago

I believe it is also essential to understand the limits of Mamba. This paper shows that state-space models fail at the copy task unless their latent state grows linearly with the sequence length: https://arxiv.org/pdf/2402.01032.pdf
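For anyone unfamiliar with the setup, here is a minimal sketch of the copy task the paper studies: the model sees a random token sequence followed by a copy marker and must reproduce the sequence exactly. The vocabulary size, marker token, and helper names below are illustrative assumptions, not the paper's exact protocol.

```python
# Minimal sketch of the copy task (assumed setup, not the paper's exact protocol).
import random

VOCAB_SIZE = 64        # assumed vocabulary size
COPY_TOKEN = "<copy>"  # assumed separator telling the model to start copying

def make_copy_example(length: int):
    """Build one copy-task example: a random sequence followed by a copy
    marker as input, with the same sequence as the target output."""
    seq = [random.randrange(VOCAB_SIZE) for _ in range(length)]
    inputs = seq + [COPY_TOKEN]
    targets = seq  # the model must reproduce every token exactly
    return inputs, targets

# Example: one length-8 instance
x, y = make_copy_example(8)
print(x)
print(y)
```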

pengzhangzhi commented 4 months ago

Thanks!!! That's a good paper. My takeaway is that it may be necessary to increase the hidden state dimension for the model to memorize the previous tokens as sequences get longer.
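To make that takeaway concrete, here is a rough back-of-the-envelope calculation (my own arithmetic under simple information-theoretic assumptions, not a bound from the paper): a fixed-size state of `state_dim` units at `bits_per_unit` bits of precision holds at most `state_dim * bits_per_unit` bits, while copying a length-`L` sequence over a vocabulary of size `V` requires about `L * log2(V)` bits, so the state has to grow roughly linearly with the length being copied.

```python
# Back-of-the-envelope capacity estimate (my own arithmetic, not the paper's bound).
import math

def max_copyable_length(state_dim: int, vocab_size: int, bits_per_unit: float = 16.0) -> int:
    """Rough upper bound on how many tokens a fixed-size state could store:
    total state bits divided by bits needed per token."""
    state_bits = state_dim * bits_per_unit
    bits_per_token = math.log2(vocab_size)
    return int(state_bits // bits_per_token)

# Example with illustrative numbers: a 16-dim state per channel across 256 channels,
# fp16 precision, and a 50k-token vocabulary.
print(max_copyable_length(state_dim=16 * 256, vocab_size=50_000))
```

Under these assumptions, doubling the sequence length you want to copy requires roughly doubling the total state size, which matches the paper's linear-growth condition.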

pengzhangzhi commented 4 months ago

I am gonna leave this issue open for discussion of the paper.