L-guangQ / MSLTE

This is an implementation of the paper "MSLTE: multiple self-supervised learning tasks for enhancing EEG emotion recognition".

Preprocess #2

Open · Xiaochi111 opened this issue 1 month ago

Xiaochi111 commented 1 month ago

Hello author, I would like to ask about the preprocessing of the DEAP data. How exactly do you extract the DE features, and what is the final shape of the processed data?

L-guangQ commented 1 month ago

We extracted DE per channel for each of four frequency bands: Theta (4-7 Hz), Alpha (8-13 Hz), Beta (14-30 Hz), and Gamma (31-50 Hz). The final shape of the data is 40 × 60 × 32 × 4, where 40 is the number of trials, 60 the number of seconds, 32 the number of channels, and 4 the number of frequency bands.
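For reference, a minimal sketch of how per-channel, per-band DE features of this kind are commonly computed, assuming DEAP's 128 Hz preprocessed signals and the Gaussian closed form DE = 0.5·ln(2πeσ²); the band-pass design and windowing here are illustrative, and the repository's actual preprocessing script may differ:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # DEAP's preprocessed recordings are downsampled to 128 Hz
BANDS = {"theta": (4, 7), "alpha": (8, 13), "beta": (14, 30), "gamma": (31, 50)}

def bandpass(x, low, high, fs=FS, order=4):
    # Butterworth band-pass with cutoffs normalized by the Nyquist frequency
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def extract_de(trial):
    """trial: (32 channels, 60 * FS samples) -> (60, 32, 4) DE features."""
    n_ch = trial.shape[0]
    feats = np.zeros((60, n_ch, len(BANDS)))
    for k, (low, high) in enumerate(BANDS.values()):
        filtered = bandpass(trial, low, high)      # (32, 60 * FS)
        windows = filtered.reshape(n_ch, 60, FS)   # 1-second windows
        var = windows.var(axis=-1)                 # (32, 60)
        # DE of a Gaussian signal: 0.5 * ln(2 * pi * e * sigma^2)
        feats[:, :, k] = 0.5 * np.log(2 * np.pi * np.e * var).T
    return feats

# Stacking the features of all 40 trials yields the 40 x 60 x 32 x 4 array.
```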

Xiaochi111 commented 1 month ago

Hello author, thank you for taking the time to clear up my confusion; your answer was very helpful. I have one more question. I am currently reproducing your code for the SEED subject-independent experiment: training accuracy stays at around 98%, but test accuracy is only about 45%. Could you tell me whether I am doing something wrong, or what might cause this? Thank you very much!

L-guangQ commented 1 month ago

Our experiments follow the same setup as the baselines: we report the best test accuracy over training epochs, and for each subject we select the accuracy of the best session. I don't know if you do the same.
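In other words, the reporting protocol is roughly the following (a hypothetical sketch with illustrative names, not the repository's actual functions):

```python
import numpy as np

def best_over_epochs(epoch_test_accs):
    """Report the best test accuracy reached across training epochs."""
    return max(epoch_test_accs)

def subject_accuracy(sessions):
    """For one subject, keep the best session's best-epoch accuracy."""
    return max(best_over_epochs(run) for run in sessions)

# e.g. one per-epoch test-accuracy curve for each of a subject's 3 sessions
sessions = [np.random.uniform(0.4, 0.8, size=50) for _ in range(3)]
print(f"reported accuracy: {subject_accuracy(sessions):.4f}")
```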

Xiaochi111 commented 1 month ago

Hello, I have not modified any code related to this; it is consistent with the code you made public, and my parameters are the same as in the source code. Could it be that SEED_indep uses some parameters that differ from SEED? Below is the content of the seed_indep.txt file generated by the model:

```
k = 2 mask_rate=0.7
---------- 2024-09-25 16:22:53.503684 ----------
s01, day-1, acc-0.437831
s01, day-2, acc-0.408662
s01, day-3, acc-0.481732
####### SEED_indep ###### Test_acc: 0.4427±0.0300
s02, day-1, acc-0.453742
s02, day-2, acc-0.399529
s02, day-3, acc-0.543312
####### SEED_indep ###### Test_acc: 0.4541±0.0484
s03, day-1, acc-0.481143
s03, day-2, acc-0.428403
s03, day-3, acc-0.476134
####### SEED_indep ###### Test_acc: 0.4567±0.0420
s04, day-1, acc-0.454037
s04, day-2, acc-0.464938
s04, day-3, acc-0.490572
####### SEED_indep ###### Test_acc: 0.4600±0.0376
s05, day-1, acc-0.363288
s05, day-2, acc-0.533883
s05, day-3, acc-0.476134
####### SEED_indep ###### Test_acc: 0.4596±0.0462
s06, day-1, acc-0.426341
s06, day-2, acc-0.352681
s06, day-3, acc-0.466411
####### SEED_indep ###### Test_acc: 0.4522±0.0492
s07, day-1, acc-0.483795
s07, day-2, acc-0.431055
s07, day-3, acc-0.532115
####### SEED_indep ###### Test_acc: 0.4565±0.0493
s08, day-1, acc-0.567177
s08, day-2, acc-0.443430
s08, day-3, acc-0.492634
####### SEED_indep ###### Test_acc: 0.4620±0.0517
s09, day-1, acc-0.502946
s09, day-2, acc-0.500000
s09, day-3, acc-0.424573
####### SEED_indep ###### Test_acc: 0.4636±0.0504
s10, day-1, acc-0.414555
s10, day-2, acc-0.435474
s10, day-3, acc-0.488804
####### SEED_indep ###### Test_acc: 0.4618±0.0491
s11, day-1, acc-0.479670
```

L-guangQ commented 1 month ago

The worst results I got were not that low. You can try this set of parameters: a learning rate of 0.005 or 0.05 and a batch size of 1024. Here are our previous results:

```
k = 2 mask_rate=0.7
---------- 2023-05-22 20:15:29.809216 ----------
s01, day-1, acc-0.747201
s01, day-2, acc-0.713907
s01, day-3, acc-0.531526
####### SEED_indep ###### Test_acc: 0.6642±0.0948
s02, day-1, acc-0.794932
s02, day-2, acc-0.833235
s02, day-3, acc-0.717737
####### SEED_indep ###### Test_acc: 0.7231±0.0955
s03, day-1, acc-0.769004
s03, day-2, acc-0.734237
s03, day-3, acc-0.858279
####### SEED_indep ###### Test_acc: 0.7445±0.0889
s04, day-1, acc-0.857395
s04, day-2, acc-0.698586
s04, day-3, acc-0.458751
####### SEED_indep ###### Test_acc: 0.7262±0.1168
s05, day-1, acc-0.784031
s05, day-2, acc-0.768415
s05, day-3, acc-0.764585
####### SEED_indep ###### Test_acc: 0.7355±0.1061
s06, day-1, acc-0.591043
s06, day-2, acc-0.725987
s06, day-3, acc-0.815852
####### SEED_indep ###### Test_acc: 0.7314±0.1044
s07, day-1, acc-0.758103
s07, day-2, acc-0.643489
s07, day-3, acc-0.709193
####### SEED_indep ###### Test_acc: 0.7274±0.0987
s08, day-1, acc-0.862699
s08, day-2, acc-0.890984
s08, day-3, acc-0.911314
####### SEED_indep ###### Test_acc: 0.7475±0.1068
s09, day-1, acc-0.653506
s09, day-2, acc-0.874484
s09, day-3, acc-0.580436
####### SEED_indep ###### Test_acc: 0.7426±0.1099
s10, day-1, acc-0.691220
s10, day-2, acc-0.710666
s10, day-3, acc-0.639069
####### SEED_indep ###### Test_acc: 0.7363±0.1063
s11, day-1, acc-0.797584
s11, day-2, acc-0.813200
s11, day-3, acc-0.837065
####### SEED_indep ###### Test_acc: 0.7436±0.1041
s12, day-1, acc-0.825575
s12, day-2, acc-0.679729
s12, day-3, acc-0.706541
####### SEED_indep ###### Test_acc: 0.7430±0.1013
s13, day-1, acc-0.745728
s13, day-2, acc-0.742192
s13, day-3, acc-0.739246
####### SEED_indep ###### Test_acc: 0.7430±0.0973
s14, day-1, acc-0.769593
s14, day-2, acc-0.732174
s14, day-3, acc-0.780200
####### SEED_indep ###### Test_acc: 0.7443±0.0941
s15, day-1, acc-0.686800
s15, day-2, acc-0.730701
s15, day-3, acc-0.984973
####### SEED_indep ###### Test_acc: 0.7480±0.0980
```
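If it helps, the suggested change amounts to something like the following (a hypothetical config; the key names are illustrative and need to be mapped onto the training script's actual arguments):

```python
# Hypothetical hyperparameters matching the suggestion above; key names
# are illustrative, not the repository's actual argument names.
config = {
    "lr": 0.005,        # or 0.05
    "batch_size": 1024,
    "k": 2,             # as in the log header
    "mask_rate": 0.7,
}
```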

Xiaochi111 commented 1 month ago

Hello, changing the batch size and learning rate does not change the result. My experimental results on the SEED dataset are consistent with your paper, so I would like to ask whether any code differs between the SEED and SEED_indep experiments.

L-guangQ commented 1 month ago

As far as I remember, only the parameters are adjusted; everything else is unchanged. You could also try changing the axis to 1 at line 135 in load_data.py, so that standardization is applied along the sample dimension.
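For anyone reading along, the suggested edit corresponds to something like this minimal sketch; the actual code at line 135 of load_data.py and the array shape below are assumptions for illustration:

```python
import numpy as np

def standardize(x, axis=1):
    """Z-score x along `axis`; axis=1 computes the mean and std along
    the sample dimension, as suggested above."""
    mean = x.mean(axis=axis, keepdims=True)
    std = x.std(axis=axis, keepdims=True) + 1e-8  # guard against zero std
    return (x - mean) / std

# Illustrative shape only: (trials, samples, channel-band features)
feats = np.random.randn(45, 60, 310)
feats = standardize(feats, axis=1)
```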

Xiaochi111 commented 3 weeks ago

Hello author, I found the problem and solved it. Thank you very much for all your replies, and I wish you every success in your work.