rmpeng / TIE-EEGNet

Temporal information enhanced EEGNet
MIT License

EnK Layer PyTorch implementation #1

Open MohSamNaf opened 6 months ago

MohSamNaf commented 6 months ago

Hello. Thank you for uploading your work to GitHub; the paper has been an interesting read. I noticed that your code references an EnkLayer, but its implementation is not included in the uploaded code. Have you used this part from here? Or is this layer actually the TIE_Layer in fea_model, rather than the EnkLayer?

If the EnkLayer is what is meant, that implementation is in TensorFlow, while your work is in PyTorch. Do you have a full implementation of the EnkLayer that works with your proposed TIE_EEGNET model?

Thank you

ErHai1 commented 3 months ago

@MohSamNaf Hello, I'm having the same problem as you. Could you please tell me if you have solved it?

MohSamNaf commented 3 months ago

> @MohSamNaf Hello, I'm having the same problem as you. Could you please tell me if you have solved it?

Hello. You don't need the EnK layer at all for the TIE-EEGNet model to work. I didn't solve the EnkLayer part myself; I adopted their work and integrated the TIE_Layer into the original EEGNet architecture, roughly like the sketch below.
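For anyone else landing here, this is a minimal sketch of what that integration can look like. It assumes a generic EEGNet-style first block; `EEGNetWithTIE` and the placement after the temporal convolution are my assumptions for illustration, not the repo's actual fea_model:

```python
import torch
import torch.nn as nn

class EEGNetWithTIE(nn.Module):
    """Toy EEGNet-style front end with a TIE-style encoding layer
    inserted after the first temporal convolution (placement assumed)."""
    def __init__(self, n_channels=22, tie_layer=None):
        super().__init__()
        # EEGNet block 1: temporal convolution over the time axis
        self.temporal_conv = nn.Conv2d(1, 8, (1, 64), padding=(0, 32), bias=False)
        self.bn1 = nn.BatchNorm2d(8)
        # The TIE layer (or any positional encoder) slots in here
        self.tie = tie_layer if tie_layer is not None else nn.Identity()
        # EEGNet block 1 (cont.): depthwise spatial convolution
        self.spatial_conv = nn.Conv2d(8, 16, (n_channels, 1), groups=8, bias=False)

    def forward(self, x):            # x: (batch, 1, channels, time)
        x = self.bn1(self.temporal_conv(x))
        x = self.tie(x)              # temporal information is injected here
        return self.spatial_conv(x)
```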

rmpeng commented 1 month ago

Sorry for the late reply!

The EnK method was proposed in Singh, A.K. and Lin, C.T., 2020, "EnK: Encoding time-information in convolution," arXiv preprint arXiv:2006.04198, and its code is here.

In my study, however, for late sample points, such as the end point of the 4-second samples in my experiments, EnK assigns positional encodings with large values (for example, 511), which may overwhelm the feature map's values. We therefore replaced the linear EnK encoder with the TIE module, which is a recurrent encoder. As a result, TIE-EEGNet does not have an EnK layer.
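To make the scale issue concrete, here is a toy comparison (not the repo's code). A linear EnK-style index grows to T - 1 = 511 over a 512-point window, while a bounded periodic code, used here only as a stand-in for the recurrent TIE formula in the paper, stays in [-1, 1]:

```python
import torch

T = 512                                   # e.g. a 4-second window at 128 Hz
t = torch.arange(T, dtype=torch.float32)  # time indices 0 .. 511

linear_code = t                                  # EnK-style linear code: reaches 511
periodic_code = torch.sin(2 * torch.pi * t / T)  # bounded stand-in: stays in [-1, 1]

feat = torch.randn(8, T)                   # feature maps are typically O(1)
print((feat + linear_code).abs().max())    # dominated by the encoding at late t
print((feat + periodic_code).abs().max())  # same order as the features
```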

BTW, to implement an EnK-EEGNet, you can replace self.TIE in TIE_Layer with an 'EnK' encoding (and delete self.alpha), referring to the original paper and code of EnK; a rough sketch follows.
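A hedged sketch of that swap, with hypothetical names since the real TIE_Layer isn't pasted here: the recurrent code (and self.alpha) is replaced by a learnably scaled linear time index, which matches the linear-encoder behavior described above; see the EnK paper for the exact formulation.

```python
import torch
import torch.nn as nn

class EnKLayer(nn.Module):
    """Hypothetical EnK-style replacement for TIE_Layer: adds a learnably
    scaled linear time index to the feature map."""
    def __init__(self):
        super().__init__()
        # Learnable scale; stands in for self.alpha / self.TIE in TIE_Layer
        self.k = nn.Parameter(torch.zeros(1))

    def forward(self, x):            # x: (batch, filters, channels, time)
        t = torch.arange(x.size(-1), device=x.device, dtype=x.dtype)
        return x + self.k * t        # broadcast the linear code over the time axis
```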

Good Luck!