Thanks, Arno.
Here's what I notice. Subject 2 has the following number of events, as indexed by "cell2mat({EEG.event.type})":
1 has 181 events (stimulus onset, Easy Trial)
2 has 149 events (stimulus onset, Hard Trial)
3 has 152 events (Correct Response, Easy)
4 has 141 events (Correct Response, Hard)
9 has 19 events (Incorrect Response, Easy)
10 has 21 events (Incorrect Response, Hard)
254 has 1 event (Start task code)
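For reference, a minimal sketch of how counts like these can be tabulated when the event types are numeric (the variable names here are just illustrative):

```matlab
% Tally numeric event codes; assumes EEG.event.type holds numbers,
% as in the original dataset before any boundary events are inserted.
types = cell2mat({EEG.event.type});   % numeric vector of event codes
codes = unique(types);
for k = 1:numel(codes)
    fprintf('%d has %d events\n', codes(k), sum(types == codes(k)));
end
```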
This is a basic Flankers task with 2 events per trial (stimulus onset and the subject response) and 330 trials.
After running EEG = clean_artifacts(EEG,'FlatlineCriterion',5,'ChannelCriterion',0.8,'LineNoiseCriterion',4,'Highpass',[0.25 0.75],'BurstCriterion',10,'Distance','Euclidian','WindowCriterionTolerances',[-Inf 7]), the event count matches the original file's event count.
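A sketch of that check (the EEGclean name is just illustrative, not from the original post):

```matlab
% Run the same clean_artifacts call on a copy and confirm that this
% parameter set does not drop any events.
EEGclean = clean_artifacts(EEG, 'FlatlineCriterion',5, 'ChannelCriterion',0.8, ...
    'LineNoiseCriterion',4, 'Highpass',[0.25 0.75], 'BurstCriterion',10, ...
    'Distance','Euclidian', 'WindowCriterionTolerances',[-Inf 7]);
fprintf('events before: %d, events after: %d\n', numel(EEG.event), numel(EEGclean.event));
```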
If I run the full clean_artifacts function from the GUI with BurstRejection set to 'on' and WindowCriterion set to 0.25, the event types are converted to strings (so collecting them yields a cell array), probably because of the 'boundary' events. In any case, I can only count the events using '{ EEG.event.type }':
1 has 180 events
10 has 5 events
2 has 134 events
3 has 82 events
4 has 87 events
9 has 6 events
boundary has 345 events
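A sketch of how the counting works once 'boundary' events force the types to strings (the num2str conversion is just a safe way to handle any remaining numeric codes and is illustrative):

```matlab
% With 'boundary' markers present, EEG.event.type may mix numbers and strings,
% so convert everything to strings before tallying.
types = cellfun(@num2str, { EEG.event.type }, 'UniformOutput', false);
[codes, ~, idx] = unique(types);
for k = 1:numel(codes)
    fprintf('%s has %d events\n', codes{k}, sum(idx == k));
end
```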
Thus, the full default clean_artifacts removed a large portion of subject response codes without removing many stimulus codes, but also added a tremendous number of "boundary" event flags. Using a more lax or more aggressive value for Window Criterion does not help the situation.
Data were collected with a Biosemi system: 34 EEG channels plus VEOG+, VEOG-, HEOG+, HEOG-, Mastoid 1, and Mastoid 2. I'm using eeglab2019b on MATLAB 9.7.0.1190202, 64-bit.
I have checked, and using a higher value is less aggressive.
Manually setting 'WindowCriterion' to 0.4 instead of the default 0.25 tends to reject more data (according to Greg Perlman). This contradicts the documentation, which indicates that "Generally a lower value makes the criterion more aggressive".
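One way to check this is to run the same cleaning with only WindowCriterion changed and compare how much data survives (a sketch assuming the other clean_artifacts defaults; this is not code from the thread):

```matlab
% Compare the amount of data retained under the two WindowCriterion values;
% fewer remaining samples means the setting was more aggressive.
EEG25 = clean_artifacts(EEG, 'WindowCriterion', 0.25);
EEG40 = clean_artifacts(EEG, 'WindowCriterion', 0.4);
fprintf('samples retained: 0.25 -> %d, 0.4 -> %d\n', EEG25.pnts, EEG40.pnts);
```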