argonne-lcf / Megatron-DeepSpeed

Ongoing research training transformer language models at scale, including: BERT & GPT-2

Feature: State space models #46

Open hatanp opened 3 months ago

hatanp commented 3 months ago

State space models could help with very long sequences found in some scientific datasets.

NVIDIA has recently implemented Mamba2 in Megatron-LM. Could we use that?

Hacking in a naive Mamba2 example was fairly easy, and it has been run on Sunspot, but it was not memory- or compute-efficient: the efficient implementation depends on a causal conv1d kernel implemented in CUDA.
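
For reference, here is a minimal sketch of what the naive fallback path looks like when the fused causal-conv1d CUDA kernel is unavailable: a causal depthwise conv1d written in pure PyTorch. The `(batch, dim, seqlen)` layout and depthwise grouping follow the Mamba convention; the function name and shapes here are illustrative, not the actual Megatron-LM or `causal-conv1d` API.

```python
# Sketch of a naive causal depthwise conv1d in pure PyTorch (assumed fallback
# when the fused CUDA kernel is unavailable); names are hypothetical.
from typing import Optional

import torch
import torch.nn.functional as F


def naive_causal_conv1d(
    x: torch.Tensor,
    weight: torch.Tensor,
    bias: Optional[torch.Tensor] = None,
) -> torch.Tensor:
    """Causal depthwise conv1d.

    x:      (batch, dim, seqlen)
    weight: (dim, kernel_size) -- one filter per channel (depthwise)
    bias:   (dim,) or None
    """
    dim, kernel_size = weight.shape
    # Left-pad by kernel_size - 1 so position t only sees inputs <= t (causality).
    x = F.pad(x, (kernel_size - 1, 0))
    # groups=dim makes the convolution depthwise, one filter per channel.
    return F.conv1d(x, weight.unsqueeze(1), bias=bias, groups=dim)


# Example: batch=2, dim=8 channels, seqlen=16, kernel_size=4
x = torch.randn(2, 8, 16)
w = torch.randn(8, 4)
y = naive_causal_conv1d(x, w)
assert y.shape == x.shape  # output length matches input length
```

This version is functionally correct but materializes the padded tensor and runs the convolution as a generic op, whereas the fused CUDA kernel does the causal depthwise pass in a single kernel, which is presumably why the naive hack was memory- and compute-inefficient at scale.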