state-spaces / mamba

Mamba SSM architecture

Mamba for EMG signal analysis #162

Open mohdil23 opened 5 months ago

mohdil23 commented 5 months ago

Hello, I have EMG needle time-series signals, which are 1D. They are very long, ranging from 300,000 to over 1,000,000 data points each.

How can the Mamba code be used to take an input of shape (batch size, 1 channel, EMG signal length) and predict a single class label?

Happy to chat more if needed.

Regards

MT

tridao commented 5 months ago

It's a sequence-to-sequence mapping, with input (batch, seqlen, dim) and output (batch, seqlen, dim). You can use it in the same way you'd use any other sequence-to-sequence layer (e.g. attention).
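A minimal sketch of that usage for the single-channel EMG case described above (assuming the `mamba_ssm` package is installed and a CUDA device is available; `d_model=64` and the 1-to-`d_model` input projection are illustrative assumptions, not recommendations):

```python
# Minimal sketch: Mamba as a sequence-to-sequence layer.
import torch
import torch.nn as nn
from mamba_ssm import Mamba

batch, length, d_model = 2, 10_000, 64

emg = torch.randn(batch, 1, length, device="cuda")   # (batch, 1 channel, length)
x = emg.transpose(1, 2)                              # (batch, seqlen, 1)
x = nn.Linear(1, d_model).to("cuda")(x)              # project 1 channel -> d_model

layer = Mamba(
    d_model=d_model,  # model dimension
    d_state=16,       # SSM state expansion factor
    d_conv=4,         # local convolution width
    expand=2,         # block expansion factor
).to("cuda")

y = layer(x)          # (batch, seqlen, d_model): same shape in and out
print(y.shape)        # torch.Size([2, 10000, 64])
```

Stacking several such layers (with residual connections and normalization) gives a full backbone, as in the classifier sketch further down the thread.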

dr-smgad commented 5 months ago

@mohdil23 as said above, you should treat Mamba much like an LSTM or GRU cell. To perform classification on top of it, you could add a classification head (e.g., ReLU layer(s) plus an output layer suited to your labels, such as softmax or sigmoid) that takes the last output y as input and produces your class label.
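A minimal sketch of that idea (the class name, layer count, pooling over the last time step, and `num_classes=5` are all illustrative assumptions; only the `Mamba(d_model, d_state, d_conv, expand)` call follows the repo's README usage):

```python
# Minimal sketch: Mamba backbone + classification head for 1-channel EMG.
import torch
import torch.nn as nn
from mamba_ssm import Mamba

class MambaEMGClassifier(nn.Module):
    def __init__(self, d_model=64, n_layers=2, num_classes=5):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)       # 1 EMG channel -> d_model
        self.layers = nn.ModuleList(
            [Mamba(d_model=d_model, d_state=16, d_conv=4, expand=2)
             for _ in range(n_layers)]
        )
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Sequential(                    # classification head
            nn.Linear(d_model, d_model),
            nn.ReLU(),
            nn.Linear(d_model, num_classes),          # logits; pair with CrossEntropyLoss
        )

    def forward(self, emg):                           # emg: (batch, 1, length)
        x = self.input_proj(emg.transpose(1, 2))      # (batch, length, d_model)
        for layer in self.layers:
            x = x + layer(x)                          # residual around each Mamba block
        x = self.norm(x)
        pooled = x[:, -1, :]                          # the "last y"; x.mean(dim=1) also works
        return self.head(pooled)                      # (batch, num_classes)

model = MambaEMGClassifier().to("cuda")
logits = model(torch.randn(2, 1, 10_000, device="cuda"))  # -> shape (2, 5)
```

For very long signals, mean pooling over time (or over chunks) may be more robust than using only the last time step, but that is a modeling choice to validate on your data.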