emadeldeen24 / ECGTransForm

[Biomedical Signal Processing and Control] ECGTransForm: Empowering adaptive ECG arrhythmia classification framework with bidirectional transformer
https://www.sciencedirect.com/science/article/abs/pii/S1746809423011473
MIT License

Bidirectional transformer #7

Open hhhey-lw opened 4 months ago

hhhey-lw commented 4 months ago

I would like to ask why a bidirectional self-attention is necessary here. As far as I know, in the original Transformer the MHSA in the encoder is already bidirectional (see BERT), while the masked MHSA in the decoder masks out future positions and attends only to the past (see GPT). Also, in your code the input to the Transformer is not transposed, whereas the standard practice is to transpose [batch, channel, length] to [batch, length, channel].
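
For context, here is a minimal PyTorch sketch (not the repo's code; the tensor sizes are illustrative) of the two conventions the question refers to: unmasked encoder-style attention, which is already bidirectional, versus causally masked decoder-style attention, along with the usual [batch, channel, length] → [batch, length, channel] transpose:

```python
import torch
import torch.nn as nn

batch, channel, length = 8, 64, 187  # hypothetical sizes for illustration

# Conv feature maps come out as [batch, channel, length]; attention over time
# steps expects [batch, length, channel] (i.e., [batch, seq_len, embed_dim]).
x = torch.randn(batch, channel, length)
x = x.permute(0, 2, 1)  # -> [batch, length, channel]

mhsa = nn.MultiheadAttention(embed_dim=channel, num_heads=4, batch_first=True)

# Encoder-style (BERT-like) attention: no mask, so every position attends to
# every other position -- already bidirectional.
out_bidirectional, _ = mhsa(x, x, x)

# Decoder-style (GPT-like) attention: a causal mask (True = blocked) hides
# future positions, so each position attends only to the past.
causal_mask = torch.triu(torch.ones(length, length, dtype=torch.bool), diagonal=1)
out_causal, _ = mhsa(x, x, x, attn_mask=causal_mask)
```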