State space models (SSMs) could help with the very long sequences found in some scientific datasets, since their cost scales linearly with sequence length rather than quadratically like attention.
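As a rough illustration of why, here is a minimal sketch of the linear recurrence at the core of an SSM. This is plain PyTorch with illustrative shapes, not the Mamba2 kernel itself (Mamba2 adds input-dependent parameters and a hardware-efficient parallel scan):

```python
import torch

def ssm_scan_naive(x, A, B, C):
    """Sequential scan of a linear state space model:
        h_t = A h_{t-1} + B x_t,   y_t = C h_t

    x: (seqlen,), A: (n, n), B: (n,), C: (n,)
    Runtime is O(seqlen) with O(1) state, vs. O(seqlen^2) for
    self-attention -- the reason SSMs look attractive for very
    long inputs.
    """
    h = x.new_zeros(A.shape[0])
    ys = []
    for x_t in x:
        h = A @ h + B * x_t  # state update
        ys.append(C @ h)     # readout
    return torch.stack(ys)
```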
NVIDIA recently implemented Mamba2 in Megatron-LM. Could we use that?
Hacking in a naive Mamba2 example was fairly easy, and it has been run on Sunspot, but it was neither memory- nor compute-efficient: the efficient implementation depends on a causal conv1d kernel implemented in CUDA.
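For concreteness, without that fused CUDA kernel the convolution has to fall back to something like the pure-PyTorch sketch below (function name and shapes are illustrative, not the actual Megatron-LM or causal-conv1d code):

```python
import torch
import torch.nn.functional as F

def causal_conv1d_naive(x, weight, bias=None):
    """Causal depthwise conv1d in plain PyTorch (illustrative fallback).

    x:      (batch, dim, seqlen)
    weight: (dim, width) -- one short filter per channel
    bias:   (dim,) or None
    """
    dim, width = weight.shape
    # Left-pad so position t only sees inputs <= t (causality).
    x = F.pad(x, (width - 1, 0))
    # groups=dim makes the convolution depthwise (per-channel).
    return F.conv1d(x, weight.unsqueeze(1), bias=bias, groups=dim)

# e.g. x = torch.randn(2, 64, 4096); w = torch.randn(64, 4)
# causal_conv1d_naive(x, w) -> shape (2, 64, 4096)
```

Functionally this matches the causal conv1d, but it materializes the padded activations and misses the kernel fusion, which is consistent with the memory and compute overhead seen on Sunspot.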