MiaoZhengQing / LMDA-Code

code for LMDA

data loading error at data_loader.py #2

Open dotchang opened 6 months ago

dotchang commented 6 months ago

Thank you for sharing this interesting code.

I prepared the BCICIV_2a GDF data from the original source "https://bbci.de/competition/iv/download/", specifically "Data sets 2a: [ GDF files zipped (420 MB) ]", and I downloaded A04T.mat and A04E.mat from "https://bnci-horizon-2020.eu/database/data-sets".

I then tried to run data_loader.py with a small mne-related modification (raw_gdf.info["gdf_events"] = gdf_events -> raw_gdf.info["temp"]["gdf_events"] = gdf_events), and I got these errors:

```
C:\Users\yoshi\LMDA-Code>python data_loader.py
Reading 0 ... 600914 = 0.000 ... 2403.656 secs...
Used Annotations descriptions: ['1023', '1072', '32766', '768', '769', '770', '771', '772']
Traceback (most recent call last):
  File "C:\Users\yoshi\LMDA-Code\data_loader.py", line 159, in <module>
    train_cnt = train_loader.load()
                ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yoshi\LMDA-Code\data_loader.py", line 16, in load
    events, artifact_trial_mask = self.extract_events(cnt)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yoshi\LMDA-Code\data_loader.py", line 85, in extract_events
    classes = loadmat(self.labels_filename)["classlabel"].squeeze()
KeyError: 'classlabel'
```

So, am I using the wrong data? Could you give us any hints on which data to download and how to prepare it?
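For reference, the mne-related change was along these lines (just a sketch of the relevant fragment; gdf_events is the structure that data_loader.py itself builds from the GDF annotations):

```python
# Sketch of the modification described above; not the full loader code.
# Original assignment (rejected by newer MNE, which validates Info keys):
#     raw_gdf.info["gdf_events"] = gdf_events
# Modified to store it in the 'temp' slot instead; note that info["temp"]
# is None by default, so it may need to be initialised as a dict first:
raw_gdf.info["temp"] = {}
raw_gdf.info["temp"]["gdf_events"] = gdf_events
```
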
dongliang0 commented 3 months ago

Info does not support directly setting the key 'gdf_events'. You can set info['temp'] to store temporary objects in an Info instance, but these will not survive an I/O round-trip.
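A standalone sketch of what that note means (assuming a recent MNE, roughly >= 1.0, where Info validates its keys and provides a 'temp' slot):

```python
# Standalone illustration, not code from data_loader.py.
import numpy as np
import mne

info = mne.create_info(ch_names=["C3", "Cz", "C4"], sfreq=250.0, ch_types="eeg")

# info["gdf_events"] = ...  # raises: Info no longer accepts arbitrary keys
info["temp"] = {"gdf_events": np.array([[1000, 0, 769]])}  # allowed, in-memory only
print(info["temp"]["gdf_events"])  # usable in this session, but the entry is
                                   # dropped on an I/O round-trip (save/load)
```
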

Same problem here.

lyzzh commented 2 months ago


I met the same problem as described in the original report. Have you solved it?

maryeol commented 2 months ago

I have downloaded the GDF files from https://bbci.de/competition/iv/download/ and the MAT format from https://www.kaggle.com/datasets/reader443/bci-competition-iv-dataset-2a-in-mat-format, but I am also encountering the same KeyError: 'classlabel'. After some troubleshooting, I noticed that the MAT files have different keys, and when I looked into the data content, I don't think the class label is included in the object. Maybe we need to download the class labels from another source, but that would be complicated because we would then need to map the data to the labels. So it would be very helpful if you could tell us the source from which you downloaded the dataset. Thanks!
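In case it helps, a quick way to check which variables a downloaded .mat file actually contains (a minimal diagnostic sketch; the file name is just an example):

```python
# Minimal diagnostic: list the variables stored in a label .mat file.
from scipy.io import loadmat

mat = loadmat("A04T.mat")  # example path; adjust to the file you downloaded
print([k for k in mat.keys() if not k.startswith("__")])
# data_loader.py expects a 'classlabel' entry here; if it is missing, this is
# not the label file the loader was written against.
```
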

MiaoZhengQing commented 2 months ago

Thank you all for your attention and comments. My gut feeling is that the data reading error is due to an MNE version mismatch, as the MNE version I used in my experiments was relatively old. I have two suggestions:

  1. Downgrade the MNE version.
  2. For newer MNE versions, use the newer functions to read the data (see the sketch at the end of this comment).

I apologize as my hands are full nowadays, and I can no longer access the server from my previous affiliation to verify the MNE version. I will add this issue to my to-do list. Thank you for your understanding.
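
For suggestion 2, newer MNE exposes the GDF markers as annotations, so something along these lines should work (a minimal sketch, assuming mne >= 1.0; the file name and event handling are illustrative, not the exact logic in data_loader.py):

```python
# Minimal sketch for newer MNE versions; illustrative only.
import mne

raw_gdf = mne.io.read_raw_gdf("A04T.gdf", preload=True)

# Newer MNE no longer stores GDF markers on raw.info; convert the
# annotations into an events array explicitly instead:
events, event_id = mne.events_from_annotations(raw_gdf)
print(event_id)  # e.g. {'1023': 1, ..., '769': 7, '770': 8, ...}
```

For suggestion 1, pinning an older mne release (for example, pip install "mne<1.0") should restore the old behaviour, though I cannot currently confirm the exact version the repository was developed with.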