sccn / clean_rawdata

Cleaning Raw EEG data
GNU General Public License v3.0

Memory error #41

Closed: neuroro closed this issue 1 year ago

neuroro commented 2 years ago

Hi there

I am getting "Not enough memory" errors when running clean_rawdata 2.6 via the EEGLAB 2021.1 interface to do artifact subspace reconstruction (just the actual ASR burst correction, none of the other functionality of clean_rawdata). I've tested this on two different datasets (similar length; one is 410 s at 1000 Hz) on two different machines (one with 128 GB RAM and one with 8 GB RAM).

EEGLAB reports the error occurring in asr_process (line 132), which is where it checks maxmem. It seems to be due to maxmem defaulting to 64 in clean_artifacts (line 195) and/or in clean_asr (line 137), both of which feed into asr_process. I notice that without input, asr_process (line 110) and asr_calibrate (line 121) dynamically calculate maxmem as:

maxmem = hlp_memfree/(2^21)

I've got it 'working' via the GUI by replacing the 64 with [] in clean_artifacts (line 195) and in clean_asr (line 137), and by adding || isempty(maxmem) to the if statement on line 120 of asr_calibrate, so that the code falls back to hlp_memfree instead. Is this reasonable?

(Obviously it can be solved by not using the EEGLAB interface, but I would like to include ASR in a pre-processing pipeline usable by non-coders.)
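For reference, here is a minimal MATLAB sketch of the fallback logic these edits amount to. The helper name resolve_maxmem is made up for illustration; in clean_rawdata itself the change is spread across clean_artifacts, clean_asr, and asr_calibrate at the lines mentioned above.

```matlab
% Hypothetical helper (not part of clean_rawdata) capturing the workaround:
% pass [] instead of the hard-coded 64 in clean_artifacts / clean_asr, and
% let an empty maxmem fall back to the free-RAM estimate in asr_calibrate.
function maxmem = resolve_maxmem(maxmem)
if nargin < 1 || isempty(maxmem)
    % hlp_memfree (clean_rawdata helper) returns free physical memory in
    % bytes; dividing by 2^21 gives roughly half of it, expressed in MB
    maxmem = hlp_memfree/(2^21);
end
end
```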

Cheers! Rohan

arnodelorme commented 2 years ago

Apologies for the late response. I was traveling. Would you mind sharing your dataset so I can try to fix the issue?

neuroro commented 2 years ago

Hey, all good. I'm not sure I can share my dataset due to ethics, but I have replicated the situation using the EEGLAB sample data scaled up to 128 channels at 1000 Hz, which is what we record (see the attached script in the zip). Running ASR on that gives a memory error (via the command line), and I've confirmed this with my colleague. I'm assuming it's the 64 MB maxmem default being too small?

Cheers for all of this!

ASRtest.zip
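In outline, the script does something like this (a rough sketch from memory rather than the attachment verbatim; the exact sample file name and cutoff may differ):

```matlab
% Rough sketch: scale the EEGLAB sample dataset up to 128 channels at 1000 Hz
% and run ASR burst correction with the default memory settings.
eeglab nogui;
EEG = pop_loadset('eeglab_data.set');      % sample dataset shipped with EEGLAB (in cwd)
EEG = pop_resample(EEG, 1000);             % upsample to 1000 Hz
nRep = ceil(128/EEG.nbchan);
EEG.data = repmat(EEG.data, nRep, 1);      % tile channels up to at least 128...
EEG.data = EEG.data(1:128, :);             % ...then keep exactly 128
EEG.nbchan = 128;
EEG.chanlocs = [];                         % channel locations are not needed for ASR
EEG = eeg_checkset(EEG);
EEG = clean_asr(EEG, 20);                  % ASR burst correction only -> "Not enough memory"
```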

arnodelorme commented 1 year ago

The problem with using the available memory is that clean_rawdata becomes non-deterministic (the results are not reproducible). This is why the default was set to 64. Note that if I try to run your code, MATLAB wants to use more than 64 GB of RAM (I killed it after that so my machine would not crash).
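To illustrate the point (a toy sketch, not the actual asr_process code): the memory budget determines how the recording is split into chunks, and the chunk boundaries move when the budget changes, which can alter the output by small numerical amounts.

```matlab
% Toy illustration only (not asr_process): a memory budget sets the chunking.
nChannels = 128;  nSamples = 410*1000;      % dimensions similar to this thread
maxmem_mb = 64;                             % a fixed budget keeps the chunking reproducible
samplesPerChunk = max(1, floor(maxmem_mb*2^20 / (8*nChannels*10)));  % crude 10x overhead factor
nChunks = ceil(nSamples / samplesPerChunk);
for k = 1:nChunks
    idx = (k-1)*samplesPerChunk + 1 : min(k*samplesPerChunk, nSamples);
    % each chunk is processed with filter/covariance state carried across the
    % boundary; a different budget shifts these boundaries and can change the
    % result by tiny amounts, hence the fixed default of 64
end
```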

neuroro commented 1 year ago

Ah interesting, that makes sense! So a larger fixed value such as 256 should solve the out-of-memory error while staying deterministic, yeah? Thank you for following up :) (Also my apologies, I must have written that script on the machine with ample RAM, and so didn't notice it asking for 64 GB!)

arnodelorme commented 1 year ago

Yes, it should. Let us know.

arnodelorme commented 1 year ago

I have implemented a fix that increases the memory limit when it is too low. Let me know. See 303c735
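(For the curious, the shape of such a guard is roughly as below; this is purely illustrative, with an invented size estimate, and the commit itself is the reference.)

```matlab
% Illustrative only; see commit 303c735 for the real change. The idea: grow the
% memory budget when it is too small to process even one block, instead of erroring out.
nChannels    = 128;                                       % example dimensions
blockSamples = 1000;
requiredMB   = ceil(8*nChannels^2*blockSamples*10/2^20);  % invented rough estimate
maxmem       = 64;
if maxmem < requiredMB
    maxmem = requiredMB;                                  % raise the budget rather than fail
end
```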

neuroro commented 1 year ago

All working, thank you! And that's an elegant fix!