state-spaces / mamba

Mamba SSM architecture

Consider allowing hidden state initialisation via ssm_state input parameter for selective_scan_fn #258

Open govorunov opened 7 months ago

govorunov commented 7 months ago

Please, please, consider adding an ssm_state input parameter to selective_scan_fn to allow hidden-state initialisation for the Mamba block. Also, please consider making the hidden state differentiable, as selective_scan_fn currently documents:

Note that the gradient of the last state is not considered in the backward pass.
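For concreteness, here is a minimal pure-PyTorch sketch of what is being asked for, modelled on the shapes of the reference implementation. `selective_scan_with_init` and its `ssm_state` argument are hypothetical names, not the actual API, and this is the slow reference recurrence, not the fused CUDA kernel:

```python
import torch

def selective_scan_with_init(u, delta, A, B, C, D=None, ssm_state=None):
    """Pure-PyTorch sketch of a selective scan accepting an initial state.

    Shapes follow the reference implementation:
      u, delta:  (batch, dim, seqlen);  A: (dim, dstate);
      B, C:      (batch, dstate, seqlen);  D: (dim,);
      ssm_state: (batch, dim, dstate) or None (zeros, the current behaviour).
    """
    batch, dim, seqlen = u.shape
    dstate = A.shape[1]
    # Discretisation: deltaA = exp(delta * A), deltaB_u = delta * B * u
    deltaA = torch.exp(torch.einsum("bdl,dn->bdln", delta, A))
    deltaB_u = torch.einsum("bdl,bnl,bdl->bdln", delta, B, u)
    # Proposed change: seed the recurrence with the given state instead of zeros.
    x = ssm_state if ssm_state is not None else u.new_zeros(batch, dim, dstate)
    ys = []
    for t in range(seqlen):
        x = deltaA[:, :, t] * x + deltaB_u[:, :, t]           # state update
        ys.append(torch.einsum("bdn,bn->bd", x, C[:, :, t]))  # readout
    y = torch.stack(ys, dim=-1)  # (batch, dim, seqlen)
    if D is not None:
        y = y + u * D.unsqueeze(-1)  # skip connection
    # Returning x without detach() keeps the last state differentiable,
    # which is the second half of the request.
    return y, x
```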

This change would open the path to an encoder-decoder Mamba architecture and to an encoder-only, BERT-like architecture. The RNN-analogous setup would be: a Mamba encoder runs through the input sequence ignoring its outputs, the last hidden state is used to initialise the decoder together with an input token, and the decoder unrolls the state recursively (see the sketch below). For the encoder to work, the last hidden state has to be differentiable; this would also open a route to encoder-only BERT-style architectures, classification/embedding problems, etc. For the decoder to work, the Mamba block needs to be able to accept a hidden state at initialisation.
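A sketch of the encoder-decoder pattern this would enable, reusing the hypothetical `selective_scan_with_init` above; all shapes, seeds, and names are illustrative:

```python
import torch

torch.manual_seed(0)
batch, dim, dstate = 2, 16, 8
src_len, tgt_len = 32, 24
A = -torch.rand(dim, dstate)  # negative entries keep the recurrence stable

def make_inputs(seqlen, requires_grad=False):
    u = torch.randn(batch, dim, seqlen, requires_grad=requires_grad)
    delta = torch.rand(batch, dim, seqlen)  # positive step sizes
    B = torch.randn(batch, dstate, seqlen)
    C = torch.randn(batch, dstate, seqlen)
    return u, delta, B, C

# Encoder: run over the source sequence, discard outputs, keep the last state.
u_src, delta_src, B_src, C_src = make_inputs(src_len, requires_grad=True)
_, enc_state = selective_scan_with_init(u_src, delta_src, A, B_src, C_src)

# Decoder: seeded with the encoder's final hidden state.
u_tgt, delta_tgt, B_tgt, C_tgt = make_inputs(tgt_len)
y, _ = selective_scan_with_init(u_tgt, delta_tgt, A, B_tgt, C_tgt,
                                ssm_state=enc_state)

# Gradients reach the encoder inputs only because the state is differentiable.
y.sum().backward()
assert u_src.grad is not None
```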

Related issues: #233, #101

PS: Excellent work! Very impressive (especially the CUDA part)!

LechengKong commented 7 months ago

Upvoting this issue and agreeing with all the points @govorunov mentioned. Being able to manipulate, or attach learnable modules to, the differentiable hidden states opens up many new ways to use Mamba.

Xudangliatiger commented 7 months ago

+1

brightonm commented 7 months ago

+1

XiangPiIi commented 5 months ago

+1

peterukk commented 2 months ago

Yes please, this is pretty much a must-have for my application in weather and climate modelling too.