csndl-iitd / realtime-sleep-staging

Using ML for identification of sleep stages in real time in humans
MIT License

3D CNN-based approach #10

Open gsaurabhr opened 3 weeks ago

gsaurabhr commented 3 weeks ago

The dataset needs to have high-density EEG. We found only one such dataset: https://github.com/csndl-iitd/realtime-sleep-staging/issues/5#issuecomment-2126712946

gsaurabhr commented 3 weeks ago

Basic data preprocessing (a minimal code sketch follows the list):

  1. Downsample to 100 Hz
  2. Apply a band-pass filter between 0.2 Hz and 40 Hz (which also removes all line noise)
  3. Re-reference to the mastoid electrodes (what are the electrode names?)
  4. Epoch the data
    • Use the annotations to detect the 30-second epochs that are manually labeled
    • Generate an individual .npy file for each epoch containing the pre-processed data from all 64 channels, and save it with an appropriate filename
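
A minimal sketch of these steps with MNE-Python. The file name, the mastoid channel labels ("TP9"/"TP10"), the annotation-to-stage mapping, and the output naming scheme are assumptions that need to be checked against the actual recordings:

```python
# Hedged sketch of the preprocessing pipeline using MNE-Python.
# File path, mastoid channel names and output naming are placeholders.
import numpy as np
import mne

raw = mne.io.read_raw("sub-29_day-1_eeg.vhdr", preload=True)  # hypothetical file name

# 1. Downsample to 100 Hz (MNE low-pass filters before decimating to avoid aliasing)
raw.resample(100)

# 2. Band-pass filter 0.2-40 Hz (also removes 50/60 Hz line noise)
raw.filter(l_freq=0.2, h_freq=40.0)

# 3. Re-reference to the mastoid electrodes (channel names assumed)
raw.set_eeg_reference(ref_channels=["TP9", "TP10"])

# 4. Epoch into the manually labeled 30 s windows taken from the annotations
events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=0.0, tmax=30.0 - 1.0 / raw.info["sfreq"],
                    baseline=None, preload=True)

# One .npy file per epoch, all channels, named by subject / epoch index / stage
stage_names = {code: name for name, code in event_id.items()}
for i, (epoch, code) in enumerate(zip(epochs.get_data(), epochs.events[:, 2])):
    np.save(f"sub-29_epoch-{i:04d}_{stage_names[code]}.npy", epoch)
```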
Tanvig commented 3 weeks ago

Paper

Tran, D., Wang, H., Torresani, L., Ray, J., LeCun, Y., & Paluri, M. (2018). A closer look at spatiotemporal convolutions for action recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 6450-6459).

Useful GitHub repositories:

Summary:

[figure not shown]
Tanvig commented 3 weeks ago

The GitHub repositories below use 3D CNNs for EEG data. They might be useful when we convert EEG data into image frames for our model (a toy sketch of that idea follows).
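
As a hedged illustration of that idea (not the architecture from the paper or the repositories above), a 30-second, 100 Hz epoch could be laid out as a clip of small 2D electrode-grid frames and passed through a toy 3D CNN. The 8x8 electrode grid, layer sizes, and five-stage output below are arbitrary assumptions:

```python
# Toy example: an EEG epoch treated as a clip of 2D "image" frames for a 3D CNN.
import torch
import torch.nn as nn

class Simple3DCNN(nn.Module):
    def __init__(self, n_stages=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1)), nn.ReLU(),
            nn.MaxPool3d((4, 2, 2)),
            nn.Conv3d(16, 32, kernel_size=(5, 3, 3), padding=(2, 1, 1)), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.classifier = nn.Linear(32, n_stages)

    def forward(self, x):  # x: (batch, 1, frames, height, width)
        h = self.features(x).flatten(1)
        return self.classifier(h)

# 30 s epoch at 100 Hz -> 3000 time points, each mapped to an 8x8 electrode frame
clip = torch.randn(2, 1, 3000, 8, 8)
logits = Simple3DCNN()(clip)  # shape: (2, n_stages)
```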

Tanvig commented 2 weeks ago

7 Basic Data Visualizations

Data information:

| Information | Description | Comments |
| --- | --- | --- |
| Number of subjects | 19 | Performed two different cognitive tasks on two different days before napping. Link: https://osf.io/zcu2w |
| Number of recordings | 36 (2-night recordings of 17 subjects and 1-night recordings of 2 subjects) | Link: https://github.com/nmningmei/Get_Sleep_data/blob/main/data/available_subjects.csv |
| Number of channels | 64 (62 EEG + 2 EOG) | Link: https://osf.io/ebvsr |
| Original sampling frequency | 1000 Hz | |
| Original highpass and lowpass filters | highpass: 0.0 Hz, lowpass: 500.0 Hz | |

Data preprocessing steps:

Visualizations

The visualizations below are for a single subject, subject 29 (day 1).

This notebook contains all the visualizations and details: https://colab.research.google.com/drive/1QYWn7DLtCCCWRf5erdWIgdv6hdFR9-xH?usp=sharing

Before pre-processing

[figures not shown]

Hypnogram

[figure not shown]

After pre-processing

[figures not shown]

Tanvig commented 6 days ago

t-SNE plots for all subjects

The t-SNE plots below were created using 30-second epochs from all subjects. Each epoch is given to t-SNE as one input feature vector, and the data is z-scored before plotting. The three plots show varying levels of perplexity (a parameter related to the number of nearest neighbors). A code sketch follows the plots.

tsne_all_subjects_perplexity30

tsne_all_subjects_perplexity50

tsne_all_subjects_perplexity75
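
A sketch of how these embeddings can be produced with scikit-learn. The input file names, and the assumption that each epoch is flattened into a single feature vector of shape (n_epochs, channels * samples), are placeholders:

```python
# Sketch: t-SNE embedding of z-scored 30 s epochs, one point per epoch.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.preprocessing import StandardScaler
from sklearn.manifold import TSNE

X = np.load("all_subjects_epochs.npy")   # hypothetical file: flattened epochs
y = np.load("all_subjects_stages.npy")   # hypothetical file: stage label per epoch

X_z = StandardScaler().fit_transform(X)  # z-score features before embedding

for perplexity in (30, 50, 75):
    emb = TSNE(n_components=2, perplexity=perplexity, init="pca",
               random_state=0).fit_transform(X_z)
    plt.figure()
    for stage in np.unique(y):
        m = y == stage
        plt.scatter(emb[m, 0], emb[m, 1], s=3, label=str(stage))
    plt.legend(markerscale=3)
    plt.title(f"t-SNE of 30 s epochs, perplexity={perplexity}")
    plt.savefig(f"tsne_all_subjects_perplexity{perplexity}.png", dpi=150)
```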

UMAP plots for all subjects

The UMAP plots below were created using 30-second epochs from all subjects. Each epoch is given to UMAP as one input feature vector, and the data is z-scored before plotting. The first two plots show varying values of n_neighbors (a parameter that balances local versus global structure in the data); the last plot is three-dimensional. A code sketch follows the plots.

UMAP_all_subjects_neighbours15

UMAP_all_subjects_neighbours8

3d_umap
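
A corresponding sketch with the umap-learn package, reusing the z-scored matrix X_z and the stage labels y from the t-SNE sketch above:

```python
# Sketch: 2D and 3D UMAP embeddings of the z-scored 30 s epochs.
import numpy as np
import matplotlib.pyplot as plt
import umap

for n_neighbors in (15, 8):
    emb = umap.UMAP(n_neighbors=n_neighbors, n_components=2,
                    random_state=0).fit_transform(X_z)
    plt.figure()
    for stage in np.unique(y):
        m = y == stage
        plt.scatter(emb[m, 0], emb[m, 1], s=3, label=str(stage))
    plt.legend(markerscale=3)
    plt.title(f"UMAP of 30 s epochs, n_neighbors={n_neighbors}")
    plt.savefig(f"UMAP_all_subjects_neighbours{n_neighbors}.png", dpi=150)

# Three-dimensional embedding
emb3 = umap.UMAP(n_neighbors=15, n_components=3, random_state=0).fit_transform(X_z)
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for stage in np.unique(y):
    m = y == stage
    ax.scatter(emb3[m, 0], emb3[m, 1], emb3[m, 2], s=3, label=str(stage))
ax.legend(markerscale=3)
fig.savefig("3d_umap.png", dpi=150)
```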